#ProjectOnline #PowerBI Report – Include #HTML formatting #PPM #PMOT #PowerQuery #OData #REST Part 3

January 16, 2018 at 8:18 pm | Posted in Configuration, Customisation, Functionality, Information, Reporting | 4 Comments

Following on from my 2nd post in this mini-series on including HTML formatting in Power BI reports, in this post we will look at a couple more options that will refresh in the Power BI App Service. If you missed the previous posts, the links are below:

Part 1: https://pwmather.wordpress.com/2018/01/01/projectonline-powerbi-report-include-html-formatting-ppm-pmot-powerquery-odata-rest-part-1/

Part 2: https://pwmather.wordpress.com/2018/01/03/projectonline-powerbi-report-include-html-formatting-ppm-pmot-powerquery-odata-rest-part-2/

The options we will look at in this post require a process to get the Project HTML data into a source that can be queried from Power BI with one call. Firstly I will demonstrate a simple PowerShell script that gets the data and writes it to a SharePoint list on the PWA site. This process is very similar to a Project Online snapshot solution starter script I published back in August 2016: https://pwmather.wordpress.com/2016/08/26/projectonline-data-capture-snapshot-capability-with-powershell-sharepoint-office365-ppm-bi/ Once you have the script running to capture the data on the defined schedule, you will see something similar to the screenshot below:


Here you can see my process has run twice: once back in August when I first wrote this script, and just now when I ran it again. As this is based on sales demo data, you can see in the two expanded examples that the data has not changed, but in real-world usage I'd like to think the data would have changed / been updated! Having the data in one list enables a single SharePoint OData call from Power BI, and as I have included the ProjectId in the data on the list, this can easily be joined with the data from the main Project OData Reporting API. As this data is in a SharePoint list, you might need to consider user permissions / access to the list. If this was running on a schedule (either from a Windows scheduled task if on-premises, or maybe a scheduled Azure Function if you wanted to make use of Azure PaaS), set the schedule to run before the reports are due, allowing time for the process to complete. I won't cover the PowerShell script in detail here as I will create a dedicated post for that in a week or two, but I will highlight the changes required if you were to start with the OData snapshot example.

  • The first API call was updated in this example, changing the select query to return just the ProjectId:


  • After the while statement, the script will start a foreach loop and set the ProjectId to a variable:


  • Then the REST URL is constructed and the ProjectId is passed in. The select query includes the Project Name, Project ID and the Multiline custom fields that I want to include. I then make the various REST calls in a try / catch block, firstly to get the data:


  • Then to write the data to the SharePoint list:


Once that runs successfully with an account that has full access to all projects and edit access to the SharePoint list, your target list will contain all of the projects along with the selected fields. As mentioned, I will post the full script in a week or two once I get a chance to tidy a few bits up in the code sample, but hopefully the screenshots of the changes, along with the snapshot example PowerShell script, will provide enough pointers to get started. Now that the data is in a single source, it is very simple to use in Power BI.
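Pulling the highlighted changes together, the core of the modified script looks something like the sketch below. Treat this as illustrative only: the authentication setup ($creds), the target list name, the list write helper and the custom field internal name are assumptions based on the snapshot script, not the final code (which will follow in the dedicated post).

```powershell
# Illustrative sketch only - $creds, the list name and Add-ListItem are placeholders
$pwaUrl  = "https://tenant.sharepoint.com/sites/PWA"
$headers = @{ Accept = "application/json;odata=verbose" }

# 1. First API call updated to select just the ProjectId
$projects = Invoke-RestMethod -Uri "$pwaUrl/_api/ProjectData/Projects?`$select=ProjectId" `
    -Credential $creds -Headers $headers

# 2. Loop through each project and set the ProjectId to a variable
foreach ($project in $projects.d.results) {
    $projId = $project.ProjectId

    # 3. Construct the REST URL, selecting the name, ID and multiline custom fields
    $restUrl = "$pwaUrl/_api/ProjectServer/Projects('$projId')/IncludeCustomFields" +
        "?`$select=Id,Name,Custom_x005f_4d0daaaba6ade21193f900155d153dd4"

    try {
        $data = Invoke-RestMethod -Uri $restUrl -Credential $creds -Headers $headers

        # 4. Write the returned values to the SharePoint list
        # (payload field names are placeholders for the list's internal names)
        $item = @{
            Title     = $data.d.Name
            ProjectId = $data.d.Id
            # one property per multiline custom field
        }
        # Add-ListItem stands in for the list write used in the snapshot script
        Add-ListItem -List "ProjectMutliLineFields" -Values $item
    }
    catch {
        Write-Warning "Failed to process project $projId : $_"
    }
}
```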

In Power BI Desktop add a new OData feed and in the URL field enter the SharePoint list REST URL for the source list. For example, the REST URL I used is: https://tenant.sharepoint.com/sites/PWA/_api/web/lists/getbytitle('ProjectMutliLineFields')/Items    where ProjectMutliLineFields is the name of my SharePoint list. Edit the query to launch the Power BI Query Editor. In this example, my source SharePoint list contains duplicate projects, but in my report I only want to see the latest. The steps below will transform the data so that the report only has the latest version of each project record. Rename the query to IDandDate, then remove all columns except for the ProjectId and Created columns:


Now group by ProjectId and get the Max Created value; I called this column "Latest":


That will give you a list of unique Project IDs with the latest record. Now add a 2nd OData feed and use the same SharePoint list REST URL as in the previous step. Remove columns that are not required; I removed all except for Title, ProjectId, Created and the multiline fields. Then rename the columns to meaningful names if required:


This query will currently contain the duplicate project records based on my example list. Next I will merge this query with the IDandDate query using the ProjectId column and the Created / Latest column:


Hold down the Ctrl key to select more than one column per table for the merge.

This will add the new column into the table:


Click the double arrow on the column heading to expand the column then select the aggregate radio button. On the dropdown menu next to Latest select Maximum:


This will show a date value for the latest records; where a null is displayed, there is a duplicate record with a later date:


Filter out the null records from the Max of Latest date column and that is it. For the purpose of this blog post, I also added a 3rd query to the Project OData API to show data from the two sources. Close & Apply the data, then ensure the relationships are correct. I also set the IDandDate query to be hidden in the report view:
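For reference, the dedup logic above can be condensed into a single Power Query M query. This is a sketch, not the queries the UI steps generate: it uses an inner join on ProjectId and Created/Latest instead of the left outer merge, Maximum aggregate and null filter, but the result is the same set of latest records (column and list names match those used above):

```m
let
    // All snapshot rows - each project can appear more than once
    Source = OData.Feed("https://tenant.sharepoint.com/sites/PWA/_api/web/lists/getbytitle('ProjectMutliLineFields')/Items"),
    // IDandDate: the latest Created value per ProjectId
    IDandDate = Table.Group(
        Table.SelectColumns(Source, {"ProjectId", "Created"}),
        {"ProjectId"},
        {{"Latest", each List.Max([Created]), type datetime}}
    ),
    // Keep only the rows whose Created date is the latest for that project
    LatestOnly = Table.Join(
        Source, {"ProjectId", "Created"},
        IDandDate, {"ProjectId", "Latest"},
        JoinKind.Inner
    )
in
    LatestOnly
```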


Then design your report as needed making use of the same HTML Viewer custom visual:


As you can see, this is just a simple example like the others just to highlight the HTML formatting being rendered in Power BI.

Another option, without having to write and maintain any custom code or write the data to a SharePoint list, is to make use of a 3rd party tool that extracts the Project Online data into an Azure SQL database as the data changes in Project Online. This particular tool is developed by the product development team I lead at CPS and is called DataStore. This product is part of our edison365 product suite but is available on its own. This isn't a sales pitch so I won't go into details here; I just wanted to give another option as some people prefer no-code solutions. There are also other software vendors with similar products for Project Online, but I'm not sure if they include the multiline project level fields with the HTML. So using this tool or similar (check they include the HTML fields), you can get all of your Project Online data into an Azure SQL database; as mentioned, the DataStore tool will also include the HTML data, as displayed in the example SQL query below:


Power BI can get data from the Azure SQL database, and this data will also refresh in the Power BI App Service.

Feel free to contact me if you have any queries or questions but hopefully that gives you some ideas on including the HTML formatting in your Project Online reports using Power BI!

#ProjectOnline #PowerBI Report – Include #HTML formatting #PPM #PMOT #PowerQuery #OData #REST Part 2

January 3, 2018 at 11:10 pm | Posted in Configuration, Customisation, Functionality, Information, Reporting | 8 Comments

Following on from my first post discussing including HTML formatting in Project Online Power BI reports, in this post we will look at a summary of options to get the correct data into Power BI and then walk through one of those options. In part 3, the final part, we will look at one of the other options to get the data.

For those of you that missed part 1, see the post here: https://pwmather.wordpress.com/2018/01/01/projectonline-powerbi-report-include-html-formatting-ppm-pmot-powerquery-odata-rest-part-1/

As per the first post, it is very simple to have the data rendered in Power BI including the HTML formatting; the slightly trickier part is to get the Project Online data into Power BI with the HTML included.

First, a bit of background on where in Project Online you can access the enterprise project multiline custom fields with the HTML included. As per the first post, you need to access the REST API ({PWAURL}/_api/ProjectServer) to get the data with the HTML included as the OData Reporting API ({PWAURL}/_api/ProjectData) has had the HTML removed. Using the REST API we can view the endpoints at the root:


This is the REST API used to programmatically interact with Project Online data; you can create, read, update and delete data using this API depending on your access. For this reporting post we only need to read data. Carry out the steps with an account that has access to all projects in the PWA instance, such as an Admin account.

The endpoint we need is /Projects:


This will detail all of the projects the logged-on user has access to. It is recommended to carry out these steps with an account that has access to all projects in the PWA instance, otherwise you might see errors in later steps. For each project detailed you will see a few key project level properties, including things like Name, Description, Created Date and ID to name a few. It is also possible to navigate from there using the Project ID to get more details for that project. For example, you can get the project tasks using the following URL: ProjectServer/Projects('{ProjectGUID}')/Tasks, or get the project team using this URL: ProjectServer/Projects('{ProjectGUID}')/ProjectResources. To get the enterprise project level multiline custom fields we need to use the following URL: ProjectServer/Projects('{ProjectGUID}')/IncludeCustomFields. Accessing this URL for the specified project GUID (replace the placeholder with an actual project GUID), you will see more properties for that project, including the multiline custom fields we need:


Notice the HTML in the custom field outlined in red in the image above. You would need to make this call for all projects, using the correct project GUIDs. It is also worth pointing out that in this API the custom fields are referenced using their internal names, for example Custom_x005f_4d0daaaba6ade21193f900155d153dd4, rather than their display names. You can use the custom fields endpoint to map the internal names to the display names: /ProjectServer/CustomFields.
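If you want that internal-to-display-name mapping outside the browser, a quick hedged sketch in PowerShell is below ($pwaUrl, $creds and the verbose JSON response shape are assumptions, as earlier):

```powershell
# Sketch: map custom field internal names to display names
# ($pwaUrl / $creds and the d.results shape are assumptions)
$url = "$pwaUrl/_api/ProjectServer/CustomFields?`$select=InternalName,Name"
$cfs = Invoke-RestMethod -Uri $url -Credential $creds `
    -Headers @{ Accept = "application/json;odata=verbose" }
$cfs.d.results | Select-Object Name, InternalName
```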

So that covers the background on how you access the multiline custom field data that includes the HTML using the REST API; next we look at how to do this from Power BI. What makes it slightly trickier than just using the normal OData Reporting API is that you have to make a call dynamically for each project GUID if you are using the REST API directly. In this series of posts we will look at calling this API dynamically straight from Power BI (covered later on in this post), but that has a limitation. There is also another method to get this data from one call / endpoint, which requires a bit of custom code / a 3rd party tool but removes the limitation. I will cover the latter option in the 3rd blog post, including a code sample / snippet.

Moving on to Power BI, getting this data dynamically and explaining the limitation. This process will follow the same approach I documented a while back to report on project site data using the SharePoint list REST API: https://pwmather.wordpress.com/2016/01/05/want-to-query-cross-project-site-sharepoint-lists-in-projectonline-projectserver-powerbi-powerquery-bi-office365-excel-ppm/ As per the post above, this will require a custom function and a custom column to call the function. The limitation of this approach is that it works fine in the Power BI Desktop client, but the data will not currently refresh in the Power BI App Service. There might be workarounds to this limitation but that is beyond the scope of this blog post.

Firstly, get a REST URL for one project that includes custom fields. For example, I have used this: https://tenant.sharepoint.com/sites/pwa/_api/ProjectServer/Projects('ad641588-f34b-e511-89e3-00059a3c7a00')/IncludeCustomFields?$Select=Id,Name,Custom_x005f_4d0daaaba6ade21193f900155d153dd4 – replace the tenant URL and project GUID with details from your PWA instance. In this example I have included just one of my multiline custom fields, but include as many multiline fields as required, separated by commas. As mentioned before, use the /CustomFields endpoint to identify the correct custom fields to include in the select statement. You can see below that the example multiline field I have used is called "Status Summary":


Now add this URL as a data source in Power BI using the Get Data > OData feed option. That will open the Query Editor and show the record:


Update the Query Name to something like projectHTMLCFsFunction as this query will be turned into a function. In the Query Editor, on the View tab access the Advanced Editor and you will see your query:


The full query will be similar to this:

    let
        Source = OData.Feed("https://tenant.sharepoint.com/sites/pwa/_api/ProjectServer/Projects('ad641588-f34b-e511-89e3-00059a3c7a00')/IncludeCustomFields?$Select=Id,Name,Custom_x005f_4d0daaaba6ade21193f900155d153dd4")
    in
        Source

This needs to be modified to turn it into a parameterised function like below; the parts added / edited are the outer let / in wrapper, the GUID parameter and the GUID concatenated into the URL:

    let loadHTMLCFs = (GUID as text) =>
        let
            Source = OData.Feed("https://tenant.sharepoint.com/sites/pwa/_api/ProjectServer/Projects('" & GUID & "')/IncludeCustomFields?$Select=Id,Name,Custom_x005f_4d0daaaba6ade21193f900155d153dd4")
        in
            Source
    in
        loadHTMLCFs

A screenshot below shows the completed query in the Query Editor, as the formatting is clearer; the added / edited bits are outlined in red:


Click Done in the Query Editor and you will see the following:


No need to do anything with the parameter or buttons. Now we need to add another data source for the other data feeds required in the report. For the purpose of this blog I will just add in the minimum required, and that is the default OData Reporting API /Projects endpoint to get the other project fields into the report. In the Query Editor, on the Home tab, click New Source > OData feed and add in the OData Reporting API URL: https://tenant.sharepoint.com/sites/pwa/_api/ProjectData then select the tables required. For this blog post I have just selected Projects. Using the Query Editor, remove unwanted columns, rename columns etc. You will need to keep at least ProjectId and ProjectType as they are required. For the purpose of the blog post I have just selected ProjectId, ProjectType, ProjectName and ProjectOwnerName. Using ProjectType, filter out ProjectType 7 as this is the Timesheet project record; keeping this in the dataset will cause errors later on.
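In the Advanced Editor, the ProjectType filter step will look something like the line below (the step and previous step names are illustrative; the generated names in your query may differ):

```m
// Remove the Timesheet project record (ProjectType 7)
#"Filtered Rows" = Table.SelectRows(#"Removed Other Columns", each [ProjectType] <> 7)
```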

Once you have edited the query as required, a new custom column needs to be added to invoke the function created earlier. Click the Add Column tab then click Custom Column. Give the column a name such as GetProjectHTMLCFs, then enter the following: projectHTMLCFsFunction([ProjectId]), as seen below:


projectHTMLCFsFunction is the name of the function we created earlier, and we are passing in the ProjectId. When you click OK, this might take a while depending on how many projects you have, as this will invoke the function for each row, calling the REST API with that row's ProjectId and bringing back the records. Once completed you will see the records in the new custom column, as below:
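For reference, the custom column step is equivalent to the following in the Advanced Editor (the step name and the previous step name, Projects, are illustrative):

```m
// Invoke the function for each row, passing in that row's ProjectId
#"Added Custom" = Table.AddColumn(Projects, "GetProjectHTMLCFs",
    each projectHTMLCFsFunction([ProjectId]))
```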


Now the column needs to be expanded. Click the double arrow in the custom column heading and expand the multiline custom fields; in this example I just have one:


Click OK and the data will refresh / load then display the data for the multiline columns:


Notice we have the HTML in the data! Rename the columns to the correct display names, then when completed click Close & Apply. The changes will now be applied to the Power BI report and the data will load. Add in the HTML Viewer custom visual as detailed in blog post 1, then add the data onto the report canvas as you would normally. Ensure that the multiline custom fields use the HTML Viewer custom visual:


An example with a normal table visual and the HTML Viewer visual:


That's it! Design your project status reports to now include the HTML formatting your users have added. Just remember, this will only refresh in the Power BI Desktop client. It can be published to the Power BI App Service, but the data will be static and will not update; you would need to open the report in the Power BI Desktop client, refresh it, then publish it back to the Power BI Service.

Next up in part 3 we will look at a slightly different approach to get the data in Power BI that does enable the report / data to refresh in the Power BI App service.

#ProjectOnline #PowerBI Report – Include #HTML formatting #PPM #PMOT #PowerQuery #OData #REST Part 1

January 1, 2018 at 9:40 am | Posted in Configuration, Customisation, Functionality, Information, Reporting | 3 Comments

My first post for 2018, Happy New Year to all! This post is the first of 2 or 3 posts covering HTML formatting in your Power BI reports from Project Online multiline project level custom fields, as seen below (screenshot from mock-up / demo data):


For those of you that are familiar with the Project Online Reporting API, Microsoft made a change back in May 2016 to remove the HTML from the OData API ({PWAURL}/_api/ProjectData): https://pwmather.wordpress.com/2016/05/30/projectonline-odata-reporting-api-updated-to-remove-html-tags-office365-bi-excel-powerbi/. This was due to requests from customers so that Excel / Power BI reports could contain cleansed data without having to remove the HTML from the strings yourself. As mentioned in the blog post above, the HTML strings for multiline project custom fields are still available from the REST API ({PWAURL}/_api/ProjectServer).

Back in November 2017 a new custom Power BI visual was released to render HTML: https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-november-2017-feature-summary/#HTMLViewer. This means that you can now include the nicely formatted text from Project Online multiline project level custom fields in your Power BI reports. A couple of screenshots below show what your project custom field multiline data probably looks like today in your reports and what it could look like. Ignore the very basic, dull looking report; this is purely to demo the HTML rendering.

Without the HTML formatting from the OData API – it is just a block of text:


With the HTML formatting – it is nicely formatted and readable:


This matches the text on the Project Detail Page (PDP) in the Project Web App for that example demo project:


To be able to include the HTML formatting there are two parts:

  • Get the data that includes the HTML
  • Add the HTML Viewer custom visual to your Power BI Desktop client

The latter is very simple from the Power BI Desktop client: either click the ellipsis in the Visualizations pane:


Or using the button on the Home ribbon:


Then search for the HTML viewer and add it:


In the next 1 or 2 posts I will cover some different options for getting access to the data that includes the HTML.

#Microsoft #Ignite Day 2 summary #Azure #Office365 #MSIgnite #Microsoft365

October 5, 2017 at 6:39 pm | Posted in Functionality, Information | Comments Off on #Microsoft #Ignite Day 2 summary #Azure #Office365 #MSIgnite #Microsoft365

******************** This is a guest blog post by Lee Mather ********************

Day Two

After recharging the batteries, we had another early start for day two and packed in as many sessions as possible. I will try to provide a summary of key takeaways for each session. Please let me know if you would like any further detail as I'm happy to discuss.

Session 1: Overview: Modern Windows 10 and Office ProPlus management with Enterprise Mobility + Security

This session focused on moving to a modern management for Windows and went into more detail than the session I attended on day 1. The main highlights were:

· Simplify deployment and device management with Intune and Windows Autopilot. Microsoft's vision is for laptops and PCs to be as easy to set up as mobile phones. Microsoft 365 powered devices will benefit from the following:

o Intelligent Security

o Easy deployment

o Always up to date

o Proactive insights with Windows Analytics


· A couple of slides which highlight the difference between traditional IT and modern IT



· There are a few paths to transition to modern management, which have been expanded by the introduction of co-management. Co-management should be released by the end of the 2017 calendar year


Session 2: Simplify hybrid cloud protection with Microsoft Azure Security Centre

The second session of the day focused on hybrid cloud protection with Microsoft Azure Security Centre. As I mentioned this in the summary of day one, I will only highlight the features.

· Hybrid Cloud Support – This allows a unified security centre for Azure workloads as well as workloads running on-premises and in other clouds. Additional information can be found here https://azure.microsoft.com/en-us/blog/azure-security-center-extends-advanced-threat-protection-to-hybrid-cloud-workloads/

· Just In Time Access – This allows administrators to lock down ports such as RDP and to allow access when required. Additional information can be found here https://docs.microsoft.com/en-us/azure/security-center/security-center-just-in-time

· Adaptive Application Controls – This allows you to define a set of applications which can run on your VMs which helps in the fight against malware. Additional information can be found here https://docs.microsoft.com/en-us/azure/security-center/security-center-adaptive-application

· Azure Security Centre will soon include Windows Defender ATP detections. Additional information can be found here https://azure.microsoft.com/en-us/blog/azure-security-center-extends-advanced-threat-protection-to-hybrid-cloud-workloads/

· An interactive threat intelligence map has been added to the Azure Security Centre for visualisations. Additional information can be found here https://docs.microsoft.com/en-us/azure/security-center/security-center-threat-intel

· An investigation dashboard is currently in preview which will correlate all relevant data with any involved entities. You will be able to navigate between entities by clicking through the graph and providing information. Additional information is available here https://docs.microsoft.com/en-us/azure/security-center/security-center-investigation

Session 3: What’s new with Microsoft Exchange Online Public Folders

This was a 30-minute session to highlight some of the new features in Exchange Online Public folders. The main takeaway for me was the migration of public folders to Office 365 Groups. This migration approach is supported for Exchange 2010, 2013, 2016 and Exchange Online. Additional information can be found here https://blogs.technet.microsoft.com/exchange/2017/09/25/migrate-your-public-folders-to-office-365-groups/

Session 4: Office 365 Security and Compliance Overview

The session focused on the following key areas of Office 365 security:

· Threat Protection

· Information Protection

· Security Management

· Compliance Management


The main takeaways and areas of interest for this session were:

· ATP has expanded to SharePoint, OneDrive for Business and Microsoft Teams



· Safe Links will now show the original URL when a user hovers over the link

· Safe Links will now apply to internal and external email

· Safe Attachments now has a preview feature that allows a user to view a document online whilst it's being scanned. Scanning time has also been greatly reduced over the last year

· SharePoint malware will be reported in the Threat Management portal

· New updates being rolled out to Office 365 Threat Intelligence

o Threat Tracker

o Threat Explorer

o Remediation capabilities

o Attack Simulator – Admins will have the ability to simulate different threat scenarios to gain an understanding of how users behave in the event of a real attack



· Single interface for creating protection labels is on the way


· New reporting capabilities being rolled out to the Office 365 Security and Compliance Centre

· New Compliance Manager should be in Public preview later this year. Additional information can be found here https://techcommunity.microsoft.com/t5/Security-Privacy-and-Compliance/Manage-Your-Compliance-from-One-Place-Announcing-Compliance/ba-p/106493


Additional information can be found here: https://techcommunity.microsoft.com/t5/Security-Privacy-and-Compliance/Bringing-deeper-integration-and-new-capabilities-to-Office-365/ba-p/109409

Session 5: Learn about the Microsoft global network and best practices for optimizing Office 365 connectivity

This was a great session that looked at the Microsoft network and, as the title suggests, provided best practice tips. The main takeaways from the session were as follows:

· Avoid proxy servers when connecting to Office 365

· Review Office 365 IPs regularly and update firewalls if required

· Ensure your traffic is routing to the Microsoft network in as few hops as possible


· Network performance requirements for Skype for Business and how to test your network using the Skype for Business tool – https://www.microsoft.com/en-us/download/details.aspx?id=53885



· Skype for Business is the most sensitive to network performance and requires additional ports for best user experience


· Some interesting facts regarding the Microsoft network


Session 6: Yammer’s roadmap for enhanced integration, security and compliance

This was a breakout session that only lasted 45 minutes. The main takeaways were:

· Local data residency preview Q4 2017

o Existing customers H2 2018

· GDPR compliance by Q1 2018

· Yammer will soon be added to eDiscovery in the Office 365 Security and Compliance Centre

· DLP policy tips will be rolling out to Yammer in 2018

· SharePoint Group integration


That concluded another busy day at Ignite!

#Microsoft #Ignite Day 1 summary #Azure #Office365 #MSIgnite #Microsoft365

October 5, 2017 at 6:18 pm | Posted in Functionality, Information | Comments Off on #Microsoft #Ignite Day 1 summary #Azure #Office365 #MSIgnite #Microsoft365

******************** This is a guest blog post by Lee Mather ********************

So, after an eventful week in Orlando at Microsoft Ignite, I thought I would share a summary of my first day! With over 26,000 attendees it was always going to be a busy event with lots going on; we would need a team here to take in all of the information being presented to us. I am hoping to have a chance to share the other days ASAP.

Day One

During the first day I attended several sessions and would like to share the following takeaways.


The mission statement from Microsoft is to "Empower every person and every organisation to achieve more"


The keynote focused on digital transformation which is aligned to the four key areas below:

· Modern Workplace

· Business Applications

· Application & Infrastructure

· Data & AI


Following on from the keynote session, I had a day packed with sessions mainly focusing on creating a modern workplace.

Session 1: Create a modern workplace with Microsoft 365

The session focused on the following key areas:

· Flexibility to work on any device, anywhere

· Improvements to Office applications, including PowerPoint Designer using AI.

· Teamwork

o Microsoft Teams will be the hub for teamwork including chat, calls and meetings.

o Microsoft Teams will replace the Skype for Business client over the next year

Other worthwhile mentions from the session included Cortana Calendars, a new feature that is currently in "Exclusive" preview. Essentially this service will help intelligently schedule meetings on your behalf, simply by adding Cortana in the CC field. The service will support Office 365, Outlook.com or Google calendars. Additional information can be found here https://calendar.help/

“With the speed of artificial intelligence and the personal touch of a human assistant, Calendar.help takes care of business.”

The session also focused on modern device management and deployment. Windows Autopilot will allow new device deployment in a few simple steps, straight from the vendor to your desk without involving IT. Working with Windows Autopilot and a device management solution such as Intune, the device will be deployed, Azure AD joined (or offline domain joined to Active Directory), configured with all policies defined by IT, and have its apps deployed. More on this over the coming days.

Device Co-Management was also announced as a new feature. This is the "hybrid" device management that will support a transition from System Centre Configuration Manager (SCCM) to Intune. Co-Management will allow a device to be managed by both SCCM and Intune, allowing a gradual migration to modern management whilst supporting legacy devices and applications.


Finally, a couple of new offerings from Microsoft: Microsoft 365 Education and Microsoft 365 F1. Additional details can be found here https://products.office.com/en-gb/business/office-365-f1 and https://products.office.com/en-gb/academic/compare-office-365-education-plans

Session 2: Use MDM migration tools to accelerate move from GPO to MDM

This session focused on accelerating a move away from GPO to MDM policies. GPO has been around for years, with many legacy policies in the workplace that may or may not be required today. Microsoft are continually releasing new device configuration policies to Intune. To support the migration from GPO to MDM, the following PowerShell tool can be executed to output a report: https://github.com/WindowsDeviceManagement/MMAT

The tool is being regularly updated to align with new features available in Intune.


Session 3: Cloud infrastructure: Enabling new possibilities together

The Cloud Infrastructure session focused on using Azure for all workloads, big and small.


The session explained how you can set up an SAP HANA platform in minutes using Azure instead of weeks using on-premises hardware. Microsoft are continually providing more options for scaling in Azure, currently offering up to 20TB of RAM for large SAP HANA deployments.


It was also recommended to use DevTest Labs to reduce costs for non-production workloads. For more information on DevTest Labs please take a look here https://azure.microsoft.com/en-us/services/devtest-lab/

The Azure Security Centre is now Hybrid too, allowing for non-Azure machines to be onboarded to take full advantage of the security monitoring and recommendations built into Azure.


This is a great addition to Azure Security Centre, providing a single unified view of your servers, whether on-premises or running in Azure.

Currently Azure has 42 regions with hundreds of data centres, 4,500+ peering locations and 130+ edge sites, making the Azure network one of the largest in the world.


Microsoft, in conjunction with Facebook, also completed the highest-capacity transatlantic cable, which is approx. 4,000 miles long and can support 160Tb/sec of data.


Other worthy mentions would include:

· Additional Express Route partners


· Azure Distributed Denial of Service (DDoS) Protection has just been released to preview. Additional information on the DDoS service from Microsoft can be found here https://azure.microsoft.com/en-us/blog/azure-ddos-protection-service-preview/


· Azure Data Box has been released in preview and can be used to support customers with large data imports to Azure. The device supports 100 TB and a customer can have multiple devices. Additional information can be found here https://azure.microsoft.com/en-us/blog/announcing-the-preview-for-the-azure-data-box-achievements-will-be-unlocked/


· Azure File Sync is another new preview release that allows an Azure file share to sync with on-premises file servers. Cloud tiering keeps a certain amount of data on the local server while a larger amount of data is stored in the cloud. Azure File Sync is also integrated with Azure Backup. Additional information can be found here: https://azure.microsoft.com/en-us/blog/announcing-the-public-preview-for-azure-file-sync/


I could go on all day about updates from Ignite, so instead I will provide links to everything else announced during the session!

· Azure Policy – https://azure.microsoft.com/en-us/services/azure-policy/

· Azure Migrate – https://azure.microsoft.com/en-us/blog/announcing-azure-migrate/

· FastTrack for Azure – https://azure.microsoft.com/en-us/programs/azure-fasttrack/

· Cloudyn – (Cost Management for Azure) https://blogs.microsoft.com/blog/2017/06/29/microsofts-acquisition-cloudyn-will-help-azure-customers-manage-optimize-cloud-usage/

Session 4: What’s new and upcoming in AD FS to securely sign-in your users to Office 365 and other applications

The ADFS session focused on what's new and what's coming in the future for ADFS with Office 365. As this was a shorter session, I will highlight only the key takeaways.

· Passwords could soon be a thing of the past, with passwordless options for ADFS 2016


· How to get to passwordless authentication


· Extranet lockout for ADFS with known locations is coming soon. This feature will help prevent user accounts from being locked out by failed external logon attempts.


· Resolve ADFS authentication issues faster using the help site – https://adfshelp.microsoft.com/

· ADFS Rapid Restore tool – https://docs.microsoft.com/en-us/windows-server/identity/ad-fs/operations/ad-fs-rapid-restore-tool

That was enough for one day and it was time to leave the convention centre and head back to the hotel.

Link #ProjectOnline tasks to #Planner #Microsoft365 #PPM #PMOT #MSProject

August 29, 2017 at 4:04 pm | Posted in Administration, Functionality, Information | Comments Off on Link #ProjectOnline tasks to #Planner #Microsoft365 #PPM #PMOT #MSProject
Tags: , , , , ,

In the latest release of Project Online Desktop Client, depending on the release channel that you are on, you might have noticed the new Planner button in the Task ribbon:


My Office installation is on the Office Insider Fast channel so that I get the latest changes first, but this will typically be controlled by your IT admins, so you might need to wait a while until this feature reaches the Office release channel that you are on.

Hovering over the new button provides the details for this feature:


This feature allows you to create a hyperlink to the linked Planner plan from the Planner icon in the indicators column. For example, you might have a bucket-type task (a sprint, etc.) in the Project Online project, with the detailed tasks / activities held in the linked Planner plan.

Select a task in the project then click the Planner button and a side pane will launch:


Click the "Link to existing Planner plan.." link and you will be able to type the name of the Office 365 Group that contains the Planner plan:


Start typing the name of the group, then select the correct group; in this example I have one called Pauls Test Plan:


Notice how it also states that it will add the resources assigned to the task into the group. Click the Create Link button:


It then shows that this task is now linked to that Planner plan:


You then get a hyperlink directly to the linked Planner plan, either via the Planner icon in the indicators column or via the link in the Link to plan pane.
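There is no public API for the desktop client's Planner link itself, but the Planner plans owned by an Office 365 Group can be enumerated with the Microsoft Graph API, which is handy for checking which plans a group already has before linking. Below is a minimal sketch assuming you already have an OAuth access token for Graph and the group's id; the group id and token shown are placeholders, not real values.

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def group_plans_request(group_id: str, access_token: str) -> urllib.request.Request:
    """Build a Graph request listing the Planner plans owned by an
    Office 365 Group (the group picked in the 'Link to plan' pane)."""
    url = f"{GRAPH}/groups/{group_id}/planner/plans"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {access_token}"}
    )

def plan_titles(response_body: bytes) -> list:
    """Pull the plan titles out of the Graph JSON response."""
    return [plan["title"] for plan in json.loads(response_body)["value"]]
```

To actually call the service you would pass the request to `urllib.request.urlopen` and feed the response body to `plan_titles`; building the request separately keeps the sketch testable without a live tenant.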

You can only link one task from the project plan to one Planner plan; if you try to link another task to the same Planner plan you will see this alert:


The resources added to the linked task didn't get added to the group as suggested, but as this feature is only in the Office Insider builds, that might come when it is released in the other release channels. ***Update – this feature does work providing the PWA resource email address matches the O365 user principal name. I believe other options are being explored***

For release details, see the article here: https://support.office.com/en-us/article/What-s-new-in-Project-2016-111bcaf9-bc27-4c15-80e6-85e726307520?ui=en-US&rs=en-US&ad=US#Audience=Office_Insiders
