How To Use Multi Choice People Picker Fields With Microsoft Flow Approvals

Users looking to assign an approval to multiple people in Power Automate (or Microsoft Flow) using a Multi Choice People Picker field in a SharePoint list may find that it isn’t as straightforward as it seems. When users feed the field value into an approval step, it is thrown into a for each loop (as shown below).

A challenge arises because the approval is now individually assigned to each person defined in the list item. The goal is to have one main approval that completes as soon as anyone in that list approves or denies it. To do this, users need to build a semicolon-separated string of approver emails and feed it into the “Assigned To” input. First, initialize a variable called “Approvers” and set its type to “String”.

Next, set up a for each loop over the Approvers field in the SharePoint list and append each approver’s email to the string variable using the “Append to string variable” action, typing a semicolon after the email so each iteration adds the address followed by a semicolon.
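Expressed as a single expression, and assuming the loop keeps the default name “Apply to each” and that each person in the field exposes an Email property, the value appended on each pass would be roughly:

concat(items('Apply_to_each')?['Email'], ';')

The completed variable then looks something like user1@contoso.com;user2@contoso.com; which is the semicolon-separated format the “Assigned To” input expects.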

Lastly, users can recreate an Approval step and feed it the new Approvers variable.

With this, users will notice that the approval is no longer thrown into a for each loop, and they will get one single Approval to work with.

Microsoft Power Automate Flow Trigger Conditions

It’s common to use the ‘When an item is created or modified’ trigger when creating Flows for SharePoint with Power Automate. This can be a very chatty trigger, as every change can cause the Flow to execute. Users can use Conditions, Scopes, and Run After settings within the Flow logic to determine whether they should really act on an item, but this still results in yet another unhelpful entry in the Run History.

Leveraging Trigger Conditions offers the option to check Trigger Properties and additional logic to determine if the Flow should run at all. Users can continue to use the same Trigger while streamlining both Flow logic and Run History.

Trigger Conditions are found by selecting the three dots (ellipsis) in the upper right-hand corner of the Trigger Card, then Settings, then Trigger Conditions at the bottom. Users can add more than one Condition; when multiple Conditions are present, the Trigger only runs when all of them evaluate to True.

Since the Dynamic Content menu isn’t available at this point, users must use the available Functions. Whatever expression is entered should return a Boolean value; an expression that returns any other type, such as an Object or Number, won’t allow the Flow to run at all.

Expression | Result Type | Valid Condition
@add(1, 0) | Integer | No
@true | Boolean | Yes
@equals(1, 1) | Boolean | Yes
@json(triggerBody()) | Object | No

For this example, consider List One, which has two Content Types with differing Fields: Item has only the Title column, and Item 2 adds the Example column.

Content Type | Fields
Item | Title
Item 2 | Title, Example

Normally, users would have no means of filtering which items are processed by the When an item is created or modified trigger, as no OData filtering is provided. By using the following Trigger Condition, users can ensure the Flow doesn’t process any items where Example is missing data.
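Based on the null check used later in the combined OR condition, that Trigger Condition looks like this:

@not(equals(triggerBody()?['Example'], null))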

Two Compose actions are added to show the values of the Content Type and Example properties of the triggering items. When the Test functionality is used, three scenarios can be run:

Item Content Type:

Create a new Item which has only the Title column; no processing should occur.

Item 2 Content Type Without Example Value:

Create a new Item 2 which has both Title and Example columns, complete only the Title; no processing should occur.

Item 2 Content Type With Example Value:

Create a new Item 2 which has both Title and Example columns, complete both; the Test should now fire and process the Trigger.

Here are the results showing the Item 2 Content Type and the text from the completed Example field:

Alternately, we could use the Content Type itself as the Trigger Condition:

@equals(triggerBody()?['{ContentType}']?['Id'], '0x0100ACFF228D0E467842B04850DDAE19C31C00BBAE74759D28534CB0A8EEAFC9908541')

Trigger Conditions can be grouped to create complex AND/OR logic. Adding multiple conditions acts as an AND, where all must be True, and there is no UI method for OR. Adding @false as a secondary condition illustrates this: if we run a Test, it will never trigger.

Fortunately, Microsoft has provided both and() and or() logical comparison functions, allowing the introduction of OR in a single condition.

@or(equals(triggerBody()?['{ContentType}']?['Id'], '0x0100ACFF228D0E467842B04850DDAE19C31C00BBAE74759D28534CB0A8EEAFC9908541'), not(equals(triggerBody()?['Example'], null)))

With this condition, the Flow would fire only if either the Content Type is Item 2 OR Example isn’t empty. This would run anytime the Content Type was Item 2, regardless of the value for Example, as well as for any other Content Type so long as Example isn’t empty. Tests do not fire when the Trigger Condition isn’t met, which means no more Run History entries when no real processing occurs. 

Condition Breakdown:

@: Allows the use of a Function within an Expression (anytime it isn’t entered via the Dynamic Content / Expression menu).

or(<expression1>, <expression2>, …): Return true when at least one expression is true. Return false when all expressions are false.

equals('<object1>', '<object2>'): Return true when both are equivalent. Return false when not equivalent.

not(<expression>): Return true when the expression is false. Return false when the expression is true.

[, ], and ?: These operators are used to navigate data structures. In addition to accessing indexes in an array, the square brackets also allow access to Properties or Keys.

For users currently filtering on Trigger properties with Action Cards, it’s recommended to use Trigger Conditions instead. Users can even leave their existing logic in place, as a matching condition ensures the same values are passed. Trigger Conditions work particularly well in instances where a Flow may trigger itself, such as updating a column value when a List Item is modified. In organizations with complex Flows or a large inventory, Trigger Conditions can cut down on Nesting Depth, Actions Per Flow, and Flow Executions, all of which contribute to staying within service limits and getting the most value from your subscriptions.

How to Initiate Flows from PowerApps

Flows and PowerApps increasingly go hand in hand. Together they can transform many different business processes, such as Employee Onboarding/Offboarding, Nominations, Permissions, Site Creation, and more.

A great way to utilize Flows and PowerApps in tandem is to allow a PowerApp to initiate a Flow and feed the Flow information that is already provided in the PowerApp form.

Creating the PowerApp

In this example, we are going to create a simple blank PowerApp that will have two fields: Subject and Body.

If you haven’t guessed already, the PowerApp will contain a button that, when clicked, initiates a Flow. The Flow consumes the information the user enters into the Subject and Body fields and sends an email to whoever the Flow has defined as a recipient (you could theoretically go a step further and define the recipients in the PowerApp, but I’ll leave that up to you).

Your new PowerApp should contain at a minimum two Text Input fields. You can make one of them multi-line if you’d like for the Body. Ours looks like the image below.

Next, we will add a simple button to the PowerApp labeled “Send Email”.

With the button highlighted/selected, navigate to the “Action” tab and click on “Flows”.

You’ll notice that in our example, one Flow already appears. This is because any Flow created in this environment with a plain “PowerApps” trigger will be consumable from PowerApps.

Creating the Flow

For this example, click on “Create a new flow” to start the Flow Creation process. We want to make a simple two-step Flow so click on the first template available which is the PowerApps to Custom Action flow template.

Next, we will want to create a single new step called “Send Email”. You can define yourself or anyone you’d like as the recipient and when clicking on the “Subject” or “Body” property, you will be given the dynamic content property of “Ask in PowerApps”.

If you use that property for both values, you should then be presented with something like the image shown below.

Setting Up The Action Button

Save your Flow and head back to the PowerApp we created earlier.

If you close and re-open the “Flows” panel, your new Flow should now be available. While you still have the “Send Email” button selected, click on our new Flow to add it to the button. Once added, you will notice that the “OnSelect” property has been filled in with the action of “SendEmailfromPowerApp.Run(“.

The name of this action may vary based on what you named your Flow. You will also notice that it is asking for two parameters. These parameters are the Subject and Body we asked for in the Flow.

We can now feed the Text value from the first Text Input in as the first parameter and the Text value from the second Text Input as the second.
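Assuming the two controls keep their default names of TextInput1 and TextInput2 (adjust to match your own control names), the completed OnSelect formula would look something like this:

SendEmailfromPowerApp.Run(TextInput1.Text, TextInput2.Text)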

With this, the PowerApp and Flow should now be integrated. If you run the app, fill in the values, and click “Send Email”, your Flow should kick off, emailing your recipient with the values you entered for Subject and Body.

Read Next: How to Create Filtered Relational Drop-downs with Choices in PowerApps

 

6 Takeaways, Trends, and Observations from SPTechCon 2019

The successful completion of another SPTechCon Boston is under our belt.

As I like to do after these events, I stop and think about the people I talked to in the industry, the problems and challenges they are encountering with various aspects of Office 365 tools and their businesses, and what they hoped to achieve by attending the conference.

One of my main goals at SPTechCon is to speak with attendees and understand why they came to the event. Most people come with problems or ideas and hope to get direct human feedback on technology issues.

Here are the big takeaways, challenges, and trends that I uncovered from conversations throughout the week:

Takeaways and Trends from SPTechCon 2019

1. Collaboration and Automation Are The Future

Microsoft Teams, Flow, and PowerApps are the big winners over the past year and continue to lead today and into the future. The desperate need for a powerful workflow engine to replace classic SharePoint features has put us in the driver’s seat to create real-world applications in SharePoint Online.

Microsoft Teams has been such a needed addition to the toolset that it’s no surprise how quickly it has taken off. Users have switched over almost entirely to this tool for communication and collaboration, relying on classic SharePoint features for more process-based document and information management.

2. Change is Constant in the Office 365 Ecosystem

Office 365 users have a difficult time making sure their team (often a team of one) can keep track of the features being updated and introduced by Microsoft. 

From small changes that derail a training exercise to whole new applications becoming available to an entire organization, changes to the platform are constant and can have a huge impact on how teams do business.

We, collectively, haven’t fully moved on from the classic installation, service pack, and major revision model of yesterday. In the past, users and admins could review the release notes, understand the changes, test the installation, and release several feature changes on their own schedule. Now, enhancements arrive in an unsteady stream.

This is a scary proposition for those who try to build and maintain the confidence of the business while offering cutting-edge capabilities.

It’s one of the reasons I publish the Timlin Office 365 Monthly Buzz Newsletter every month with updates, trends, and goings-on related to Office 365 and SharePoint. If you’d like to receive it, sign up here.

3. Understanding the Longer Microsoft Roadmap is Challenging at Best

Microsoft has never done a great job at helping us build a 3-5-year technology plan based on their feature and release roadmap – the guidance just isn’t cohesive enough. 

For example, if we knew that InfoPath and Workflow were going to be abandoned and replaced with PowerApps and Flow, we could have planned for it in advance. Also, the new versions of these tools don’t have feature parity, so they aren’t completely compatible as replacements.

It is a frustrating proposition for CIOs and technology leaders to recommend a plan without knowing whether the technology will be abandoned, what its possible replacement may be, and whether something else will take its place within 2-3 years.

The strategy, training, and political clout required to correctly implement these tools are too high to guess and hope.

4. Using Strategy and Governance Helps Identify the Right Tool(s) for the Job

The features in the Office 365 suite contain several overlapping and interconnected capabilities. Organizations struggle to understand how to set the proper guidance and support, and how to train people down a path they are comfortable managing.

When there are too many ways to manage tasks, it becomes almost impossible to severely limit the choices, so many organizations turn to a “free for all” approach.  This methodology can increase initial user adoption, but almost always creates major problems if the platform takes off.

Organizations should consider when to use which tools and how to set the stage to provide solutions to their internal business problems in a well-orchestrated capacity. Information architecture, business analysis, governance, training, and ongoing support are all crucial to the success of user adoption and achieving digital transformation. 

5. Guaranteeing Proper User Adoption Isn’t Easy

Hopefully, you are noticing a theme here.  Most of these issues stem from similar problems.  Without the time or resources, you often have one of two paths to choose, or possibly both paths at the same time:

  1. Pick and choose high-value problems/solutions and solve for those.
  2. Open the spigot and let people use the tools in whatever way works for them.

Both approaches have their pros and cons, but I talked to a lot of frustrated business analysts and administrators who were expected to make these tools useful for thousands of people with no help beyond their own knowledge of the platform. This is not a good recipe.

6. Limited Time and Resources for Management and Maintenance are Commonplace

The obvious final piece of the puzzle is that there isn’t enough time to make the impact these professionals want to make on their businesses.

For example, if an organization decides it needs a new ticket-tracking system, it will create a team, spin up the plan, work it through to completion, and provide training, support, and ongoing resources for management. It will then require that all new tickets come through this system, thus ensuring its viability.

When organizations start using Office 365, they treat it much differently. They buy it for email, Microsoft Office, and possibly OneDrive, and go into it assuming these are essentially desktop/individual tools.

The mindset and business approach to implementation are entirely different. Unlike a mandated/required ticketing system, many of the capabilities and solutions within Office 365 are not a pure requirement to complete work.

Instead, they are optional tools designed to create efficiency and error reduction.  You must think ahead, build solutions, and entice or require people to perform certain activities within these tools to solve specific problems.

The Roadmap Ahead In The Office 365 Industry

As you will note from these takeaways, the world, and the businesses that thrive within it, are changing.

Where we used to work so hard to create process, efficiency, and predictability, the new methods of succeeding are based on adaptability, flexibility, and some bravery to embrace and accept that the world around us will be adjusting at a pace we’ve never seen before.

Information (and misinformation) reaches the entire world in seconds; ideas, concepts, and features show up without warning.

The classic IT mentality has been tested and appears to be failing in a world that needs something different.

In conclusion, Office 365 and SharePoint continue to help organizations harness the power of technology to improve operational efficiency. As with any technology that receives numerous, regular updates, it can be challenging to keep up, but it’s worth it.

Reach out if you need any help as our team is very well versed in all of Office 365’s tools and capabilities.


The Ultimate Guide to SPTechCon Boston 2019

Next week, the annual SPTechCon Boston Conference is returning for another exciting year of training, problem-solving, and networking. 

SPTechCon will cover a wide range of SharePoint and Office 365 topic areas, and attendees will walk away with practical knowledge that they can apply immediately within their organizations. We’re excited to be a Gold Sponsor of the conference again this year.

There’s a lot happening during the week so read through this guide to make sure you attend the most important sessions and meet the right people while you’re in Boston for SPTechCon. Get all the important details, dates, and insider tips below. 

SPTechCon Schedule At A Glance

Sunday, August 25th: First day of the conference. Tutorials, Hands-on lab, happy hour.

Monday, August 26th: Technical sessions, sponsored sessions, Microsoft Keynote, networking breaks, round tables, reception in Exhibit Hall (at 5:45 pm). 

Tuesday, August 27th: Technical sessions, general sessions, networking breaks, prize announcement in Exhibit Hall.

Wednesday, August 28th: Technical sessions & general sessions. Conference closes. 

For a complete list of sessions and descriptions click here.

Don’t Miss These Exciting Events And Sessions

Office 365 Hands-On Kitchen

When: Sunday, August 25th — 9 am – 5 pm

Join a select group of “chefs” (speakers) as they create recipes for collaboration challenges with cooks (you!). There will be up to five teams led by two master chefs to guide them through the solution cooking process using all the ingredients available in Office 365!

Planning a Successful Migration to Microsoft Teams and SharePoint Online

When: Monday, August 26th — 9:15 – 9:45 am

Senior Consultant of Timlin Enterprises, Nick Bisciotti, is sharing his top tips for executing a successful migration to Microsoft Teams and SharePoint Online. During this presentation, you will develop a plan and identify tools to make your migrations a smooth and seamless process.

Microsoft Keynote on Monday Morning 

When: Monday, August 26th — 10 am

Join Dan Holme, Director of Product Marketing, as he shares the latest innovations and solutions for content collaboration, security, teamwork, process transformation, employee engagement & communications, and knowledge sharing & discovery. Learn how the experiences in Microsoft 365–including SharePoint, OneDrive, Yammer, Stream, PowerApps, Flow–integrate to power collaboration and the intelligent workplace across devices, on the web, in desktop and mobile apps, and in the hub for teamwork, Microsoft Teams.

Stump the Experts – Win a Microsoft Surface Go!

When: Tuesday, August 27th — 5 pm

Timlin is hosting this flagship SPTechCon event for the second year in a row! Ask clever, challenging questions of Microsoft experts. The person with the best question will win a Microsoft Surface Go!

This will be an open discussion where you can test your knowledge against some of the best, discover answers to troubling SharePoint and Office 365 topics, and take your turn at winning this awesome prize.

#TimlinTrivia — Join Us On Twitter

Join us every morning on Twitter from Monday to Wednesday for a round of #TimlinTrivia!

Before the morning keynote each day, we will tweet a tricky Office 365 or SharePoint question. The first attendee to respond with the correct answer will win an Amazon gift card!

How To Get the Most Out Of Your SPTechCon 2019 Experience

  • Preparation is key! Be sure to draft your conference schedule in advance and highlight the events that interest you the most.
  • Leave some free time during the day to recharge, grab a bite to eat, and network with others. Conferences are often jam-packed with back to back sessions and information, so it’s important to soak in as much information as you can without burning out by the end of each day.
  • Find the right sessions for you with filters. Use the robust filtering system on the SPTechCon program agenda to identify the right sessions to add to your schedule. You can filter by topic, session type, session category, level, and date.
  • Set goals for yourself and your experience. Decide whether your priority is networking or building your skill set, and make that your focus each day. Make a list of things you’d like to learn and people you’d like to connect with. Discover the complete list of this year’s speakers.
  • Meet the many sponsors and companies in the industry who are changing and challenging the status quo with their Office 365 and SharePoint solutions. Meet the Timlin Enterprises team and learn more about our Center of Excellence approach by stopping by booth #209. 
  • Join in on the fun on social media. A great way to network and connect with the conference is by chatting on Twitter under the official conference hashtag #SPTechCon.

We’re excited to experience the 2019 SPTechCon with you! Let us know if you’d like to connect during the conference by sending us a message on Twitter at @TimlinEnt.

How to Fix Timestamp Issues Between Exchange and SharePoint in Microsoft Flow


Microsoft Flow is often used to consume emails and post their content and attachments into a SharePoint list for processing.

The date and time an email was received is a common piece of metadata that users want to track in SharePoint lists, and the email’s metadata contains that information. However, when you try to use this field, the date and time that flows into the SharePoint list isn’t the date and time the email was received.

Why Is There an Automatic Timestamp Issue in SharePoint? 

The problem stems from the variation in date formatting.

The date and time format that comes back from Exchange emails will resemble the following: 2019-07-02T17:10:36+00:00.

To compare, the date and time format that will be stored in a SharePoint list defaults in the following format: 2019-07-02T17:10:36.0000000Z.

Due to the difference in formatting, if you simply try to store the date and time from Exchange in a SharePoint list without converting it, you will end up getting completely different times than what you would expect.

How to Fix Timestamp Issues with Microsoft Flow 

The easiest way to solve this problem is to convert the time we receive from the email into the SharePoint-friendly format. To do so, we can use the “Convert time zone” action in Microsoft Flow.

By default, timestamps come back in UTC (Coordinated Universal Time) and are then converted into the end user’s local time zone from that base.

With this information, you can convert the Base Time (the email’s received time) into the “Round-trip date/time pattern”. Our source time zone and destination time zone will remain the same, as we do not want the actual timestamp itself to change.
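If you prefer expressions over the action card, a sketch of the same conversion using the convertTimeZone() function might look like the following. The DateTimeReceived property name is an assumption that depends on which Outlook trigger you use, so verify it against your trigger’s outputs:

convertTimeZone(triggerOutputs()?['body/DateTimeReceived'], 'UTC', 'UTC', 'o')

Here 'o' is the round-trip format specifier, and using 'UTC' as both the source and destination time zone reformats the value without shifting the actual time.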

The output from this action can then be used to store the proper timestamp in a SharePoint list item as shown below and from there the issue has been resolved.

If you have any questions with this process, let us know. Also, if you work a lot with Microsoft Flow, check out one of our most popular blogs entitled How Substring Works in Microsoft Flow.


How to Use Custom Vision Intelligence with Microsoft Flow to Recognize, Identify, and Index Your Pictures, Videos, and Digital Content

Recently, we came across a problem with a Microsoft Flow we had constructed that takes incoming emails to a Shared Mailbox and posts the information into a SharePoint list. The Flow picks up on attachments and stores them as attachments in the list item created.

The issue: most people include the company logo in their email signature, but the name of the logo file varies from message to message, so there was no easy way to filter out the logo and avoid storing it as an attachment. This is where Azure Custom Vision came in: you can easily customize your own computer vision models to fit your unique use cases.

Here are a few examples of labeled images and how Custom Vision can do the work once it’s set up properly.

Custom Vision Intelligence Overview and Setup

Custom Vision Artificial Intelligence allows you to train an AI to perform image recognition tasks.

In this instance, we will be training our Custom Vision AI to recognize the Timlin Enterprises logo. It will tag the images fed to the AI as being the logo or not being the logo. We can trigger the Custom Vision AI and feed it the image we are curious about in a Microsoft Flow.

To start, navigate to https://www.customvision.ai and sign in or create an account*. Once signed in, we will want to create a New Project. Simply click on “New Project” and fill out the prompt as shown.

*Note: The account you use should be linked to an Azure Subscription it has access to. In our example, the Resource Group was already created and set to the F0 (Free) plan. When creating a new Resource Group from within Custom Vision AI, you can only create it under the S0 plan, so if you want to avoid any costs, be sure to switch the plan for the resource group to F0 after creation (shown later in this post). In addition, the resource group must remain in the South Central US location, as the functionality we are using is currently only available in that region.

Once you’ve created the project, start by creating two tags on the left-hand side. One tag can be called “Timlin Logo” and the other “Not Timlin Logo” to keep everything simple. The “Not Timlin Logo” tag will be marked as Negative, as images in this category are the opposite of what we are looking for.

The AI requires a minimum of five images per tag you create. Give the AI between 5 and 10 images that are, or look like, the Timlin logo, and 5-10 images that look nothing like it. An easy way to do this is to take the logo and slice it into quadrants (as shown below), then upload each of these sliced sections as separate images and assign them the “Timlin Logo” tag.

Sliced Images

Sliced Images in Custom Vision AI

Tags

Next, we will want to feed it the Negative images. You can use any image that does not resemble the logo. In this example, I’ve used some stock Office logo images (as shown below).

Training Our Custom Vision Intelligence

Now that you have your sample data, we can start to Train our AI. To start this process, navigate to the “Train” button at the top left-hand corner.

You are then given two options: Fast Training and Advanced Training. What you are looking for is rather unique and easily identifiable, so you can simply use the “Fast Training” process.

If you were training an AI on hundreds or thousands of images to identify a very particular object/set of objects, you may look into using the Advanced Training process which will train the AI over a specified period of time. For now, select “Fast Training”, navigate to “Train” and wait for the training process to complete. If all went well, you should get your two tags with (most likely) a 100% precision rate in identifying whether an image is the logo or not.

You can test this with other images that aren’t in your sample pool as well. If you click on “Quick Test” you can feed the AI a separate image and ensure that the proper tag is being pulled back. For example, if I feed it the last screenshot referenced in this post, I should receive “Not Timlin Logo” back (as seen below).



Triggering Custom Vision Intelligence from Microsoft Flow

Triggering your Custom Vision AI from Flow and feeding it an image to test is very straightforward.

Start by creating a new Instant Flow (one you can trigger manually). In this example, we will use a file stored in a library in our SharePoint tenant.

To do so, first get the file content by using the “Get file content using path” action in Flow.

Now that you have access to the file and its content, you can add the action to determine the tags.

Search for and add the “Predict tags from image” action in Flow. You first need to configure the Custom Vision connection: give the connection a name and your prediction key (found on the settings page of your Custom Vision application). Once that has been configured, the action itself will request your Project ID (found on the Settings page of your Custom Vision project), the Image Content (the File Content stored in the last action), and, if no default is set, an Iteration ID.

To find the Iteration ID, you can access the “Performance” tab within Custom Vision AI, click on an iteration and copy the “Iteration id” value into the Flow Action.

The completed Flow action will look like the one below.

When run, the Flow should trigger the Custom Vision AI and give you a JSON output that you can use for any future conditions with probabilities assigned to them.
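The exact properties depend on the connector version, but the prediction output generally has a shape along these lines (the tag names and probabilities here are illustrative):

{
  "predictions": [
    { "tagId": "...", "tagName": "Timlin Logo", "probability": 0.98 },
    { "tagId": "...", "tagName": "Not Timlin Logo", "probability": 0.02 }
  ]
}

A Condition in the Flow can then check whether the probability returned for the “Timlin Logo” tag exceeds a chosen threshold before deciding to discard the attachment.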

With this, you will have completed your first Custom Vision Intelligence setup with Microsoft Flow integration.

If you have any questions on the setup, or anything else related to Microsoft Flow, let us know. Our team of Microsoft experts is available to help you!

6 Ways To Transform Construction Operations With Office 365


Timlin’s focus is on helping customers with their digital transformation journey.  Although we have worked with clients in several industries, our work in the Construction industry has illustrated some notable opportunities to transform business operations using the tools in the Office 365 platform.

The Construction Lag

The construction industry currently lags other industries in the implementation of technology.  Technology advancements could boost productivity and operational efficiency through more effective communications, collaboration, business process automation, business analytics, and project management. Employees could be more empowered, optimize operations, and identify and address inefficiency.

It’s time for construction organizations to embrace technology, or they may find their peers gaining a competitive advantage. Consider that there are over 273,000 open jobs in the construction industry. Also consider that the average age of a construction employee is 43, and one-fifth of the workforce is over 55.

As companies replace their workforce with younger staff, they are going to find employees with expectations of higher-level digital tools and capabilities.

Opportunities to Achieve Digital Transformation in the Construction Industry

In our experience, we’ve found several areas where the tools in Office 365 can transform construction operations:

1.) Enable employees to access critical job-related information, regardless of location or device

Bandwidth can be an issue at construction sites, but a well-thought-out approach allows on-site and remote employees to access the information they need quickly and easily on a phone or tablet. This information typically includes CAD files, design specifications, contract information, and project plans and schedules. The mobile applications available for Teams, SharePoint, and OneDrive all provide access to the documents you need to get the job done.

2.) Optimize – and even automate – business processes

Job delays are problematic and all too common in the industry. One of the biggest ways technology can transform operations is time savings from optimizing – and even automating – business processes. Employees in offices and in the field can move from manual processes, often using paper, to more effective document management, sharing, and security.

Such efficiencies can extend to supporting departments as well. HR can automate vacation tracking, employee reviews, or expense management. Legal can more effectively organize and secure documentation for faster claims response. And that’s just a start. All of this automation can be provided through Microsoft Flow workflows and PowerApps forms, and it is available to desktop, tablet, and phone users.

3.) Improve internal communications

With a large number of remote workers, it is critical to bring the team together, communicate key news, and reinforce your company culture. As you move to a younger workforce, this type of communication is going to be assumed.

A SharePoint intranet makes it easy to access and share information and documents, regardless of where employees reside. It does not have to be elaborate, but it should be kept up to date with effective communication. The result is an improved sense of community that allows you to reinforce your brand and culture through electronic communications.

4.) Enable efficient and secure project management and collaboration

Microsoft Project Online and Teams are tools to enable efficient project management and collaboration. These tools allow your internal and contract resources to work together, and your IP is protected through proper security measures. The structured nature of Project Online also allows you to more effectively resource your projects, track your variances, and get projects done on time.

5.) Increase and improve business analytics to make better, data-driven decisions

Power BI is a powerful reporting tool that can be used to build dashboards that present the KPIs you need to track and act on to improve operations.  Dashboards that present your gross revenue, profit, and other project financials in real-time ensure you act on these metrics quickly and are not surprised by actual results.

6.) Eliminate time-wasting activities

Construction operations produce a lot of documentation, and, as mentioned, it is often paper-based. With highly organized document repositories, employees can access a document based on its content classification, whether it is specific to a project, architecture, engineering, supply chain, a vendor, or another category. With proper content classification, you can set up powerful search capabilities that eliminate time wasted looking for documents.

Timlin has successfully implemented the tools in Microsoft Office 365 for several construction companies. For example, read our case study about how we helped The Middlesex Corporation achieve digital transformation with Office 365.

And contact us if you’d like to have a quick conversation to see how we can help you as well.


How Substring Works in Microsoft Flow

In most programming languages, we have access to an extremely helpful function called substring. We can use this function to extract the snippets of a string that we need most. Microsoft Flow is no different in that it offers this function, but the way we use it is a bit different from typical modern approaches.

How Does Substring Work in Microsoft Flow?

In a programming language like JavaScript, we can use the substring functionality in various ways. Microsoft Flow, however, has one way of using it. When entering the substring expression into a Flow, you will see the inputs required to complete the function (shown below).

This looks relatively standard, but the one part that might throw you off is the length parameter. You might assume the length should simply be the length of the entire string you are pulling a substring out of. However, if we wanted to extract a file extension from the name of a file that way, it would not work.

Let’s take MyWordDocument.docx as an example.

We want to extract “docx” from the string. Typically, we would use the string as the “text” parameter (MyWordDocument.docx), the index of the period as the start index, and the entire length of the string as the length. If we do this, we will encounter an error saying that the length parameter cannot be longer than the length of the string itself.

There’s a reason this happens: the length is counted from the starting index. If we had a starting index of 5 and passed the full string length of 8 as the length, the substring would need to extend to position 13 of an 8-character string, resulting in the error.

How to Use Substring without Error

In order to remedy this problem, we need a couple of extra steps. Let’s initialize four variables in our Microsoft Flow: FileExtension, IndexOfPeriod, LengthOfString, and NewLength, as shown below.

First, we will get the index of the period in the string using the following expression:

indexOf('MyWordDocument.docx', '.')

We can store this in the “IndexOfPeriod” variable by using the “Set Variable” command in Flow. Next, we can store the entire length of the string in our “LengthOfString” variable using the following expression:

length('MyWordDocument.docx')

Now, to get the length we are going to pass to the substring function, we need to subtract the position just after the period (the period’s index plus one) from the full length of the string. We can store this in the “NewLength” variable using the following expression:

sub(variables('LengthOfString'), add(variables('IndexOfPeriod'), 1))

Finally, we can use the substring function, starting one character past the period, to store the final output (which should just be “docx”) in our “FileExtension” variable using the following expression:

substring('MyWordDocument.docx', add(variables('IndexOfPeriod'), 1), variables('NewLength'))

With that, the FileExtension variable now stores the output “docx”, which is exactly what we were looking for. Below is a screenshot of how the Set Variable commands look within Microsoft Flow.
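As a side note, a more compact alternative sketch (not the approach above, but using the same expression language) is to split the file name on the period and take the last element, which avoids the index arithmetic entirely:

last(split('MyWordDocument.docx', '.'))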

If you have any questions about Microsoft Flow, please reach out to our team here!

Read Next: Thrive’s Shift from SharePoint Designer to Flow & PowerApps

Managing SharePoint Item-Level Permissions with Microsoft Flow

 

Timlin had the opportunity to create a knowledge management system for a client using SharePoint Online. One of the requirements was to use item level-permissions to control access based on metadata. We used Microsoft Flow to satisfy this requirement and provide the client with a low-maintenance process along with tracking and visibility.

The Setup

To get started, a Document Library entitled Articles was created.  The library contained metadata to indicate Visibility, as well as a lookup to a Departments list. The Departments list contained each of the company’s departments, with an additional Text column to store the AD group associated with that Department.

The rules to be implemented were: if the Visibility is Public, grant everyone access to see the document; if the Visibility is Confidential, only members of the tagged Departments have access to view the document. To prevent any visibility into Confidential documents, the initial permissions for all items did not include the Visitors group.

The Flow

To begin, a Flow was created using the “When a file is created or modified (properties only)” trigger, specifying the Articles library as the source Library Name.

 

Two Integer variables were initialized to hold values needed later.

 

 

In order to grant access to Visitors, we need to retrieve the principal ID for the site group.  This is accomplished using the “Send an HTTP request to SharePoint” action, which allows you to send a fully authenticated request to the SharePoint REST APIs.

 

In this case, we use the SiteGroups/GetByName call to get the Visitors group, then store the result in the variable we initialized earlier.  Based on the way the JSON data is returned, we want to set the variable to the value in ['d']['Id'].
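As a sketch, assuming the site uses the default “<Site Name> Visitors” group name and the HTTP action keeps its default name, the request and the Set variable expression might look like this:

Method: GET
Uri: _api/web/sitegroups/getbyname('<Site Name> Visitors')
Headers: Accept: application/json;odata=verbose

Set variable value: body('Send_an_HTTP_request_to_SharePoint')?['d']?['Id']

The group name, action name, and variable are placeholders; adjust them to match your site and Flow.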

Next, we use the same action to break inheritance, calling BreakRoleInheritance on the item via a POST request, using the ID from the original trigger.
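A minimal sketch of that request, with the copy and clear options shown here as assumptions (copying the existing role assignments matches the clean-up steps later in the Flow):

Method: POST
Uri: _api/web/lists/getbytitle('Articles')/items(<ID from trigger>)/breakroleinheritance(copyRoleAssignments=true, clearSubscopes=true)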

 

We’ll use the Get items action to retrieve all the values from the Departments list. We’ll use this later to ensure we clean up properly.

To get started on setting permissions, we’ll use the Switch control to determine what action to take based on the Visibility column. For Public visibility, we’ll grant access to the Visitors group using the same HTTP Request action from before.

 

We’ll use the RoleAssignments/AddRoleAssignment method on the item to add the Visitors group we looked up earlier.

Note: to get the RoleDefId for the permission level you want to set, you can use the /_api/web/RoleDefinitions endpoint to retrieve all roles for the site collection.  Look for the Name element of the desired Permission Level and find the corresponding Id value.  In this case, we use 1073741826 which is associated with the Read permission level.
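Putting it together, the add call for the Public branch would look roughly like this, where the principal ID is the Visitors group value stored earlier and 1073741826 is the Read role definition:

Method: POST
Uri: _api/web/lists/getbytitle('Articles')/items(<ID from trigger>)/roleassignments/addroleassignment(principalid=<Visitors group ID>, roledefid=1073741826)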

We have much more work to do for Confidential documents.

First, we want to remove any existing Department group assignments.  We’ll use an Apply to each action to iterate over the Departments list we retrieved earlier.  For each Department, we get the principal for its group, similar to how we retrieved the Visitors group, and use the RoleAssignments/RemoveRoleAssignment method to remove any permissions for that group/permission level.
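The removal call mirrors the add call; as a sketch:

Method: POST
Uri: _api/web/lists/getbytitle('Articles')/items(<ID from trigger>)/roleassignments/removeroleassignment(principalid=<Department group ID>, roledefid=1073741826)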

 

Once removed, we’ll iterate again, this time over the Departments associated with the Article.  We’ll retrieve the full Department item using the Get item action so we can find the corresponding AD group associated with the Department.  We’ll store that value and once again use the RoleAssignments/AddRoleAssignment method to grant Read permission on the item.

 

Upon execution, any items added or modified will have the appropriate permission level set based on the Visibility and Department values tagged by the author.