Installing a Custom Theme in SharePoint Online with PowerShell

Steps: 

  1. Download and Install SharePoint Online Management Shell 
  2. Create a custom theme using the Theme Generator Tool
    • Open the Microsoft Theme Generator tool in your browser to build your new theme
    • Use the Colour section to choose custom colours that match your brand. The sample page renders in real time, showing you what the page will look like with the new colours.

      Img1 2
    • Next, click Export theme in the upper-right corner. On the next screen, choose the PowerShell tab, then copy all of the text into Notepad.

      Img2 2

  3. Start SharePoint Online Management Shell: 
    • After installation, click on the SharePoint Online Management Shell to open it.

      Img3 2

  4. Connect to SharePoint Online: 
    • Paste the following command at the prompt:
      Connect-SPOService -Url https://domain-admin.sharepoint.com
      (where domain is your SharePoint domain name)
    • Press Enter and provide your SharePoint admin credentials when prompted

      Img4

  5. Paste the Theme Script: 
    • At the command prompt, type $themepalette = and then paste the code you exported; it should look like the image below. Press Enter.
    • Next, type the following command:
      Add-SPOTheme -Identity "Name of your theme" -Palette $themepalette -IsInverted $false

      **Where "Name of your theme" is the name you want to assign to the theme.** (A consolidated script for steps 4 and 5 is sketched after this list.)

      Img5 1

    • We can now change to this new theme in SharePoint.
  6. Verify Installation: 
    • The custom theme is now installed in your SharePoint Online tenant.
    • Navigate to any SharePoint site and access the theming options:
      • Click the gear icon (⚙️) in the top right corner. Select “Change the look.”

        Img6 1

      • You should see the custom theme listed there, under the From your organization section.

        Img7 1 1

    • This process only installs the theme. You will need to manually apply it to individual sites.
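
For reference, here is a consolidated sketch of the PowerShell from steps 4 and 5. It assumes a hypothetical tenant named contoso and a theme called "Brand Theme"; the palette hashtable is truncated for illustration, so paste the full dictionary you exported from the Theme Generator in its place.

    # Connect to the tenant admin site (replace "contoso" with your own domain)
    Connect-SPOService -Url https://contoso-admin.sharepoint.com

    # Truncated illustration -- use the full palette exported from the Theme Generator
    $themepalette = @{
        "themePrimary" = "#0078d4"
        "themeDarkAlt" = "#106ebe"
        "neutralDark"  = "#201f1e"
        # ...remaining colour slots from the export...
    }

    # Register the theme in the tenant; -IsInverted should match your export
    Add-SPOTheme -Identity "Brand Theme" -Palette $themepalette -IsInverted $false

    # Optional sanity check: list the custom themes registered in the tenant
    Get-SPOTheme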

Efficient Techniques for SharePoint Co-Authoring

What is Co-Authoring in SharePoint Online? 

Co-authoring in SharePoint Online enables multiple users to simultaneously edit and collaborate on documents in real time. This feature is activated by default in SharePoint Online document libraries. When several users work on a document (such as Word, Excel, or PowerPoint) at the same time, everyone’s changes are immediately visible to all participants. 

In this blog, we have included valuable tips outlining the essential dos, don’ts, and advanced strategies for effective co-authoring in SharePoint. 

Do’s 

  1. Use Modern Browsers: Always use the latest version of modern browsers like Microsoft Edge, Google Chrome, or Firefox for the best co-authoring experience.
  2. Enable Autosave: Keep autosave enabled to ensure that all changes are saved in real time.

    Img1 1 
  3. Keep your device online: Ensure a stable internet connection to avoid losing changes and to keep the document updated in real time. 
  4. Start from scratch: Begin with a new document and a fresh template to avoid formatting issues carried over from copied content. 
  5. Save Regularly: Although SharePoint Online automatically saves your work, it’s a good practice to save your changes frequently. 
  6. Check Permissions: Make sure that all users who need to co-author have the appropriate permissions to edit the document. 
  7. Enable Version History: Ensure version history is enabled in your document library to track changes and revert to previous versions if needed (a scripted check is sketched after this list). 

    Img2 1 

  8. Use Comments: Utilize the comments feature to communicate with your co-authors directly within the document.

    Img3 1 
  9. Avoid Opening the Same Document in Multiple Browser Tabs or Applications: Opening the same document in multiple tabs or different applications can cause conflicts and versioning issues, so work on the document in a single instance. After finishing your work, remember to close the document to free up resources and reduce the risk of conflicts.
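
As a quick way to script tip 7, versioning can also be verified and enabled with PnP PowerShell. This is a minimal sketch assuming the PnP.PowerShell module, a hypothetical site URL, and a library named "Documents":

    # Connect to the site and make sure versioning is on for the library
    Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/TeamSite -Interactive
    Set-PnPList -Identity "Documents" -EnableVersioning $true -MajorVersions 500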

Don’ts 

  1. Avoid Editing Offline: Refrain from editing documents offline if multiple users are working on the same document simultaneously, as this can cause version conflicts. 
  2. Do not copy and paste directly: Pasting content straight from other documents can carry conflicting formatting into the file; paste as plain text (or retype) to avoid formatting problems. 
  3. Do not leave the document open when not editing it: Close the document when you are not actively editing to reduce the risk of conflicts. 
  4. On large documents, do not have more than 5 co-authors: Limiting the number of co-authors on large documents helps prevent conflicts and improves performance. 
  5. If possible, do not use track changes: Track changes can slow down the document and cause confusion. Use comments instead for feedback. 
  6. Don’t Use Unsupported File Formats: Stick to supported file formats like Word, Excel, and PowerPoint for co-authoring. Avoid using formats that do not support co-authoring features. 
  7. Avoid Large Files: Try not to co-author very large files, as this can slow down the process and lead to performance issues. 
  8. Don’t Overwrite Others’ Work: Be mindful of other users’ changes and avoid overwriting their work. Use the document’s history to see who made which changes. 

Advanced Tips 

  1. Implement Document Libraries: Organize documents within SharePoint libraries for better management and easier navigation. 
  2. Communicate with Co-authors: Maintain clear communication with your co-authors to avoid conflicting changes and ensure everyone is on the same page. 
  3. Assign Roles and Responsibilities: Clearly define roles and responsibilities for each co-author to streamline the collaboration process. 
  4. Monitor Document Activity: Use SharePoint’s activity monitoring features to keep track of who is editing the document and when.

    Img4 1 
  5. Use the “Check In” and “Check Out” Features: These features can help manage editing rights and reduce conflicts, especially on larger documents (see the sketch after this list). 
  6. Ensure Consistent Office Versions: Ensure all co-authors are using the same or compatible versions of Microsoft Office to avoid compatibility issues. 
  7. Maintain Consistent Naming Conventions: Establish and follow consistent naming conventions for documents to improve organization and retrieval. 
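
For illustration, checking a file out and back in (tip 5) can also be scripted. A minimal PnP PowerShell sketch, assuming a hypothetical site and file path:

    Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/TeamSite -Interactive

    # Check the document out, make your edits, then check it back in with a comment
    Set-PnPFileCheckedOut -Url "/sites/TeamSite/Shared Documents/Contract.docx"
    Set-PnPFileCheckedIn -Url "/sites/TeamSite/Shared Documents/Contract.docx" -Comment "Reviewed pricing section"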

Conclusion  

In conclusion, co-authoring in SharePoint Online is a powerful capability for teams working together on a project or document. With real-time collaboration and a range of other collaboration and productivity features, SharePoint Online makes it easier for teams to work together and achieve their goals. Whether you are working in the office or remotely, SharePoint Online co-authoring can help you be more productive and efficient. 

By following these tips, you can maximize the efficiency and effectiveness of collaborative work on SharePoint, ensuring a smoother and more productive co-authoring experience. 

Essential Guide: Applying Retention Labels in SharePoint

Retention labels in SharePoint are a powerful tool to streamline document management, ensure regulatory compliance, and optimize storage by automating the retention or deletion of content based on predefined settings. 

  • Automate document lifecycles: Set retention rules to automatically keep important files for the right amount of time, ensuring compliance and protecting sensitive information. 
  • Declutter your library: Automatically delete outdated documents, freeing up valuable storage space for what matters most. 
  • Save time and effort: Effortlessly manage documents with pre-defined retention settings that take care of the details for you. 

This guide equips you with the practical know-how of using retention labels in SharePoint. We’ll show you how to apply labels to individual files and entire libraries, so you can take control and streamline your document management process. 

Apply Retention Labels to Your SharePoint Items 

  1. Select the document or folder to which you want to apply the retention label.

    Img1
  2. In the upper-right corner, select Open the details pane.

    Img2
  3. In the Properties section, under Apply label, select Choose a label to open the list of options.

    Img3
  4. Select the appropriate retention label for your document. (To learn about the differences between the labels, you can point at each one to see a description of it and its retention period.)

    Img4
  5. Click “Save” to apply the label.

    Img5
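
If you prefer scripting, the same per-item labelling can be done with PnP PowerShell. A minimal sketch, assuming the PnP.PowerShell module, a hypothetical library named "Documents", item ID 42, and an existing label named "Contracts":

    Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/TeamSite -Interactive

    # Apply the "Contracts" retention label to a single list item
    Set-PnPListItem -List "Documents" -Identity 42 -Label "Contracts"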

Set a Default Retention Label for SharePoint Document Library 

  1. Open the document library where you want to apply the default retention label.

    Img6
  2. Click the Settings icon (gear icon) in the top right corner and then select “Library settings”. 

    Img7

  3. Then click on More library settings. 

    Img8

  4. On the Settings page, under Permissions and Management, select “Apply label to items in this list or library”. 

    Img9

  5. On the Apply Label page, select the drop-down box, then select the label that you want to apply. 

    Img10

  6. The label you select will be automatically applied to all new files added to the document library from now on. (Optional: To automatically apply the label to all files currently in the document library, select Apply label to existing items in the library.) 

    Img11

  7. Click “Save” to confirm the changes. 
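
Admins can also set the library default from script. A minimal PnP PowerShell sketch, again assuming a hypothetical library "Documents" and an existing label "Contracts"; passing -SyncToItems $true mirrors the optional part of step 6 above:

    Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/TeamSite -Interactive

    # Set the default retention label for the library and push it to existing items
    Set-PnPLabel -List "Documents" -Label "Contracts" -SyncToItems $true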

 

Label Restrictions: What You Can’t Do with Labelled Items 

When an item has a retention label, some actions might be restricted depending on the settings configured by your administrator. These restrictions help ensure proper information governance and compliance. 

The most common restrictions with labelled items involve: 

Deleting the item: Retention labels often prevent accidental deletion of important information. 

Modifying the content (unless unlocked): Items designated as records might be locked to prevent unauthorized edits. You might be able to unlock the item for editing, as explained in the next section.


If you see a message in SharePoint that you can’t edit a labelled item, it’s because the item has been declared a record. You might be able to edit this item if you first unlock it. See the next section for instructions.

Locking and Unlocking Records: Explained 

Retention labels can “lock” an item because it’s designated as a record that must be preserved. This prevents accidental deletion, which is crucial for important documents like contracts. However, authorized users might be able to unlock these items for revisions or updates. 

Why Records Get Locked: 

  • Retention labels can be configured to mark specific documents as “records.”
  • When an item is designated as a record, it’s essentially locked to prevent accidental deletion or modification. 
  • This safeguards crucial information and ensures regulatory compliance.

Here’s how to lock or unlock a record: 

  1. In the document library, select the item you want to lock or unlock.

    Img12

  2. Near the upper right of the window, select Open the details pane.

    Img13

  3. In the Details pane, under Record status, select Locked. 
  4. A toggle control appears next to the name of the setting. Select the toggle to switch from Locked to Unlocked, or vice versa.

    Img14

Once the item is unlocked, you can edit it. When you’re done editing, you can lock the item again by following the steps above and toggling the Record status back to Locked. 

Enhance Power BI Reports – Implement Dynamic Colors Based on Selected Measures

Embark on a journey to enhance your Power BI reports with dynamic color adjustments based on selected measures through our informative blog. Uncover the intricacies of implementing this feature, transforming your visuals into interactive and responsive elements that effectively convey the real-time variations within your dataset. 

Why Dynamic Colors? 

Static reports can feel monotonous, failing to capture the nuances of your data. Dynamic colors breathe life into your visuals, instantly reflecting changes in the chosen measure. This adds a layer of interactivity and clarity, making your reports more engaging and informative for your audience. 

Prepare your data: 

Ensure that your measures are clearly named and readily available in your dataset. For this example, prepare an “Equipment Data” spreadsheet with the columns “Equipment Type,” “Number of Units,” and “Order Value.”

Load your data into Power BI:  

Go to “Get Data” under “Home” and upload your created “Equipment Data” sheet. 

Create Measures according to your data: 

Create two measures – one for “Number of Orders” and one for “Order Value in $”: 

Number of Orders = COUNT('Product Data'[Order ID]) 

Order Value in $ = SUM('Product Data'[Order Value]) 

Create a parameter: 

Head to the “Modeling” tab and click “New parameter.” Choose a descriptive name (here, “Select Matrix”) and add your measures as options: “Number of Orders” and “Order Value in $”. 

 

IMG01 3

 

Starting with data visualization: 

Add two visuals from the “Visualizations” pane, a “Stacked Column Chart” and a “Clustered Bar Chart,” to display the data visually. 

 

IMG02 1 1

 

Add columns to the axis: Add the created “Select Matrix” field to the Y-axis of both charts. 

 

IMG03 1 1

 

Define conditional formatting and create measures for colors: 

Select your visual and open the “Format” pane. Click “Conditional Formatting” and choose “Field Value” as the formatting style. 

Create measures to change the colors of visuals. You can adjust them according to your data. 

Status color =
VAR selected_measure = SELECTEDVALUE('Select Matrix'[Select Matrix Order])
RETURN
    IF(selected_measure = 0, "#909fe8", "#2b56ba")

Add the color measure to the columns: 

Select the chart and go to “Visual” in the “Visualizations” pane. Expand “Columns” and set the default color to the measure, as shown below. 

 

IMG04 1

 

Though it is feasible to integrate all functionalities within a single measure, for the sake of simplicity, you have the option to create separate measures—one to alter the color of the Number of Orders KPI card and another to modify the color of the Order Value KPI card. This approach provides a clearer and more organized method for customizing your Power BI visuals.
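
As a sketch of that split (the measure names and the grey fallback colour below are placeholders following the same pattern as the Status color measure above), the two card-specific measures might look like this:

    Orders Card Color =
    VAR selected_measure = SELECTEDVALUE('Select Matrix'[Select Matrix Order])
    RETURN
        IF(selected_measure = 0, "#909fe8", "#cccccc")

    Order Value Card Color =
    VAR selected_measure = SELECTEDVALUE('Select Matrix'[Select Matrix Order])
    RETURN
        IF(selected_measure = 1, "#2b56ba", "#cccccc")

Each card then uses its own measure as the conditional Field value, so it only lights up when its measure is the one selected in the parameter.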

 

IMG05

 

Remember, the possibilities are endless. Experiment with different color palettes, explore advanced conditional formatting techniques, and let your creativity guide you. Your reports will become not just tools for analysis, but captivating narratives that inform, engage, and inspire action. 

Unlocking Automation – Trigger a Cloud Flow from Any Power BI Report

In today’s data-driven landscape, the integration of business intelligence tools and workflow automation has become paramount for organizations seeking to streamline processes and enhance decision-making. Microsoft’s Power BI and Power Automate, two powerhouse tools in the Microsoft 365 ecosystem, synergize seamlessly to empower users with insightful analytics and the ability to automate actions based on those insights.

In this blog, we will explore how to trigger a cloud flow from a Power BI report and export the results into a spreadsheet.

Add the Power Automate visual in the Power BI report

Start by adding the “Power Automate” visual to your Power BI report, as shown below.

IMG01 1

Add fields to the visual

Once added, customize it by selecting the specific fields you want to extract for your spreadsheet.

IMG02 1

Create a cloud flow from the Power Automate visual

Now, edit the “Power Automate” visual you just added to start building the flow in Power Automate.

IMG03 1

Click “New” and create an “Instant cloud flow”.

IMG04 1

Next, add a “Create CSV table” action and feed it an expression that pulls data from the Power BI visual.

triggerBody()?['entity']?['Power BI values']

IMG05 1

Add a “Create file” (OneDrive) action and fill in its inputs as below:

  1. Folder Path – Select the folder to which the spreadsheet should be exported
  2. File Name – Enter a file name (an expression also works; see the sketch after this list)
  3. File Content – Select the “Output” of the “Create CSV table” action
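
As an optional refinement (an assumption, not part of the original flow), the File Name field also accepts an expression. Using the standard utcNow() function stamps each export with a timestamp so successive runs don't overwrite one another:

    concat('PowerBI-Export-', utcNow('yyyy-MM-dd-HHmm'), '.csv')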

IMG06 1

Triggering the Automation Machine

Now, design your “Power Automate” visual to trigger the flow whenever you filter data in your Power BI report.

Simply click “Trigger flow to export report,” and your filtered data will be automatically exported to your chosen OneDrive folder in a spiffy CSV file.

IMG07 1

Take advantage of the seamless collaboration between Power BI and Power Automate to turn your insights into automated actions, simplifying processes, and extracting maximum value from your data. Now, dive in and discover the myriad possibilities that await!

Developer’s Guide to SPFx: Exploring the Key Files in SharePoint Framework

SPFx developers must grasp the layout of SharePoint Framework. This means understanding the different types of files included and what each file does. 

This blog post will cover the various files that make up an SPFx solution. 

To give you a quick overview, SPFx is a fresh framework from Microsoft designed to enhance SharePoint and Teams capabilities. Using SPFx, you can employ any JavaScript framework such as React, Angular, or Handlebars to effortlessly craft visually appealing and responsive web parts. 

A screenshot of all the files within the SPFx solution is displayed below. 

Image 01

 

Image 02

Here are the five fundamental folders, along with the other files visible: 

  1. .vscode 
  2. config 
  3. node_modules 
  4. sharepoint/assets 
  5. src 

Let’s analyse all the folders and files nested within this directory. 

  1. .vscode 
    This folder typically contains settings and configurations for Visual Studio Code, such as launch configurations, task definitions, and other editor-specific settings. It contains two files: (1) launch.json and (2) settings.json

    • launch.json:
      This file is used to configure debugging settings for Visual Studio Code. It defines how the debugger should launch and attach to processes when debugging your code. For SPFx projects, launch.json can be configured to launch and debug your web parts or extensions in SharePoint workbench or in a local workbench.
    • settings.json
      This file is used to configure settings for Visual Studio Code on a project-specific level. It allows you to customize various editor settings such as font size, tab size, auto-save behaviour, and more. You can also define workspace settings in this file, which are specific to the current project/workspace.

  2. config
    This folder may contain configuration files for the SharePoint Framework (SPFx) project, including build configurations, environment settings, and any custom configurations needed for the project.

    • config.json:

      Image 03
      This file contains configurations specific to the SharePoint Framework project, such as bundle configurations, asset locations, external dependencies, and other project-specific settings. It helps customize how the SPFx solution is built and deployed.


      Title: SPFx Configuration for the Sample Application Customizer

      • Schema and Version:
        – $schema: The URL to the JSON schema file defining the structure of this config file.
        – version: The version of the config file. 
      • Bundles:
        – Description: Defines the bundles for your application customizer.
        – "react-analog-application-customizer": Name of the bundle.
        – components: Array of components that belong to this bundle.
        – entrypoint: Entry point file for your application customizer.
        – manifest: Manifest file for your application customizer. 
      • Externals:
        – Description: Defines external dependencies for your project. Currently empty. 
      • Localized Resources:
        – Description: Defines the localization settings for your application customizer strings.
        – ReactAnalogApplicationCustomizerStrings: Identifier for the localized resources; points to the localization files for different locales.

    • deploy-azure-storage.json:

      Image 04
      This configuration file is typically used in SharePoint Framework (SPFx) projects to automate the deployment process of assets to Azure Storage.
       
      Title: Configuration for Azure Storage Deployment 

      • Schema:
          – $schema: The URL to the JSON schema file defining the structure of this config file. It ensures the file adheres to a specific format.
      • Working Directory:
           – Description: Specifies the directory where the files to be deployed are located.
           – workingDir: Path to the directory containing the assets to be deployed. 
      • Azure Storage Account Details:
           – Description: Defines the Azure Storage account details for deployment.
           – account: Name of the Azure Storage account where files will be deployed.
           – container: Name of the storage container within the specified storage account where files will be deployed. 
      • Access Key:
           – Description: Specifies the access key required to authenticate and access the Azure Storage account.
           – accessKey: Access key associated with the Azure Storage account.

      Note: This configuration file is used for deploying assets to Azure Storage. It specifies the working directory containing the assets to be deployed, the Azure Storage account name, the container within the storage account, and the access key for authentication.

    • package-solution.json:

      Image 05 1

      • Schema:
        $schema: The URL to the JSON schema file defining the structure of this config file. It ensures the file adheres to a specific format.
      • Solution Details:
        – name: Name of the SharePoint Framework solution package.
        – id: Unique identifier for the solution package.
        – version: Version of the solution package.
        – includeClientSideAssets: Indicates whether to include client-side assets in the package.
        – skipFeatureDeployment: Indicates whether to skip feature deployment during package installation.
        – isDomainIsolated: Specifies whether the solution package is domain isolated.
        – developer: Information about the developer, including name, website URL, privacy URL, terms of use URL, and MPN (Microsoft Partner Network) ID.
      • Metadata:
        – shortDescription: Short description of the solution.
        – longDescription: Long description of the solution.
        – screenshotPaths: Paths to screenshots illustrating the solution.
        – videoUrl: URL to a video demonstrating the solution.
        – categories: Categories to classify the solution.
      • Features:
        – title: Title of the feature provided by the solution.
        – description: Description of the feature.
        – id: Unique identifier for the feature.
        – version: Version of the feature.
        – assets: Assets associated with the feature, such as element manifests. 
      • Paths:
        – zippedPackage: Path to the zipped solution package (.sppkg file).
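
      A minimal sketch of what a package-solution.json can look like (the name, GUID, and paths below are placeholders):

      {
        "$schema": "https://developer.microsoft.com/json-schemas/spfx-build/package-solution.schema.json",
        "solution": {
          "name": "hello-world-client-side-solution",
          "id": "00000000-0000-0000-0000-000000000000",
          "version": "1.0.0.0",
          "includeClientSideAssets": true,
          "skipFeatureDeployment": true,
          "isDomainIsolated": false
        },
        "paths": {
          "zippedPackage": "solution/hello-world.sppkg"
        }
      }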

    • sass.json:

      Image 06

      The sass.json file is used in SharePoint Framework (SPFx) projects to configure the Sass (Syntactically Awesome Style Sheets) preprocessor options. Sass is a CSS preprocessor that adds power and elegance to the basic CSS syntax. The sass.json file allows developers to define settings for compiling Sass files into CSS during the build process. Here’s what it typically contains.

      • includePaths:
        An array of paths where the Sass compiler should look for imported files. It’s useful for importing Sass files from node_modules or other directories.
      • precision:
        Defines the number of decimal places to which rounding will occur during calculations. This setting can prevent precision loss in certain calculations, especially when dealing with responsive design or transformations.
      • outputStyle:
        Specifies the output style for the compiled CSS.

    • serve.json:

      Image 07
      This configuration file specifies the serve settings for SharePoint Framework projects. It defines the port, HTTPS usage, and different serve configurations for various scenarios. Each configuration includes details such as the SharePoint page URL to load and any custom actions to apply during serving for specific scenarios.

      • Schema:
        – $schema: The URL to the JSON schema file defining the structure of this config file. It ensures the file adheres to a specific format.
      • Port and HTTPS Configuration:
        – port: Specifies the port number on which the local development server will listen. 
         – https: Indicates whether HTTPS protocol should be used for serving the project locally. 
      • Serve Configurations:
        – Description: Defines different serve configurations for different scenarios or environments. 
        – default: Default serve configuration.
        – pageUrl: URL of the SharePoint page to load when serving the project.
        – customActions: Configuration for custom actions to be applied when serving the project.
        – location: Specifies the location of the custom action.
        – properties: Custom properties to be passed to the custom action.
        – reactAnalog: Serve configuration specific to a scenario named "reactAnalog", with a structure similar to the default configuration.
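
      For orientation, a minimal serve.json for an application customizer could look like the sketch below (the page URL and the custom-action GUID are placeholders):

      {
        "$schema": "https://developer.microsoft.com/json-schemas/spfx-build/spfx-serve.schema.json",
        "port": 4321,
        "https": true,
        "serveConfigurations": {
          "default": {
            "pageUrl": "https://contoso.sharepoint.com/sites/dev/SitePages/Home.aspx",
            "customActions": {
              "00000000-0000-0000-0000-000000000000": {
                "location": "ClientSideExtension.ApplicationCustomizer",
                "properties": { "testMessage": "Hello from serve.json" }
              }
            }
          }
        }
      }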

  3. node_modules:
    These are third-party libraries and packages that are used in SPFx projects for various purposes like building, testing, bundling, and serving the project. Examples include webpack, gulp, react, office-ui-fabric-react, etc.
    The node_modules folder can be quite large because it contains all the dependencies required for development. However, it’s essential for SPFx development as it provides the necessary tools and libraries to build and run SharePoint Framework projects.
    When you create a new SPFx project or clone an existing one, you typically don’t need to include the node_modules folder in version control systems like Git. Instead, you include a package.json file, which lists all the dependencies required for the project. Other developers can then run npm install to download and install the necessary dependencies based on the package.json file. This approach helps keep the project repository size manageable and ensures consistency across development environments.

  4. sharepoint/assets:
    In short, the sharepoint/assets folder in SPFx solutions is used to store static assets such as images, fonts, CSS, and JavaScript files. These assets are automatically hosted by SharePoint when the solution is deployed, making them easily accessible for use in SPFx components like web parts or extensions. It helps organize assets, improves performance, and ensures they are scoped to the site collection where the solution is deployed.

  5. src:
    • Localization Folder (loc):
      Description: This folder holds language localization files, such as “en-us.js” or “de-DE.js”, along with an interface file. These files allow developers to support multiple languages for web parts.
      Purpose: Developers can define language-specific strings within these files, enabling web parts to adapt to different language preferences.
    • Web Part Manifest File (webpart.manifest.json):
      Description: This JSON file defines the web part’s metadata, such as its unique id, alias, version, and preconfigured entries (default title, description, and property values).
      Purpose: SharePoint uses this manifest to identify, load, and configure the web part at run time.
    • Web Part Styling File (webpart.module.scss):

      Image 08
      Description: This file, written in SCSS (Sassy CSS), provides styling instructions for the web part.
      Purpose: SCSS offers advanced features for styling, enhancing the visual presentation and consistency of the web part across different devices and screen sizes.

    • Web Part Entry Point (webpart.ts):

      Image 09
      Description: The entry point file defines the primary functionality and configurations for the web part. It typically contains a web part class extending BaseClientSideWebPart.
      Purpose: This file governs the core behaviour of the web part, including property pane configurations and interactions with SharePoint data.

    • Components Folder (if using React):
      Description: This folder, specific to React framework usage, houses state and props interface files, along with a SCSS file. It also includes a crucial file named “webpart.tsx”.
      Purpose: In React-based projects, this folder encapsulates the component-based structure, facilitating the creation and management of reusable UI elements. “webpart.tsx” serves as the main React component file, orchestrating the rendering and functionality of the web part.  

Other files in the solution:

index.ts: Entry point for TypeScript modules, organizing and managing module dependencies. 

.eslintrc.js: Configurations for ESLint, ensuring code consistency and identifying errors. 

.gitignore: Specifies files to ignore in Git, keeping the repository clean and manageable. 

.npmignore: Specifies files to exclude in npm packages, reducing package size and improving installation speed. 

.yo-rc.json: Stores project scaffolding configurations for future reference and consistency. 

gulpfile.js: Contains tasks and configurations for Gulp, automating build processes and enhancing productivity. 

package.json: Metadata file for npm packages, defining dependencies, scripts, and project information. 

package-lock.json: Locks dependencies’ versions for reproducible builds, ensuring consistency. 

README.md: Documentation file, providing project information, setup instructions, and guidelines. 

tsconfig.json: TypeScript compiler options and configurations for the project, guiding compilation process. 

Transform Your SharePoint Online List Forms: A Guide to JSON Customization and Validation

Customizing SharePoint Online forms is typically done using SPFx or Power Apps. However, this blog will demonstrate an alternative approach using JSON to customize out-of-the-box SharePoint list forms. This method eliminates the need for SPFx or Power Apps, as simple JSON can be employed to tailor the header, footer, and body of the form. 
 
Through the use of JSON, users can create distinct sections within the form body and implement validations to dynamically show or hide form fields based on the values of other fields. Notably, the advantage of employing JSON for customization is the inherent responsiveness of the forms. Unlike other methods, there is no additional effort required to make these forms responsive, allowing seamless utilization across both desktop and mobile platforms. 
 
In this blog on SharePoint Online, we will explore the customization and formatting of SharePoint Online list forms through the use of JSON formatting, as well as column validation in SharePoint Online lists. 

Img 01 1

 

First, familiarize yourself with customizing JSON formatting. Customize the form header, form footer, and form body. Divide the body into different sections.

  1. Customize the SharePoint Online List Form Header using JSON formatting.

2024 06 14 18 05 10

 

In the Format window, you’ll encounter options such as Header, Body, and Footer. Ensure that you choose Header.

Img 02 1

In the “Formatting code” option, apply or paste the JSON formatting code provided below. Alternatively, you can retrieve this code by clicking on “Learn more” within the JSON code section. 

{
    "elmType": "div",
    "attributes": {
        "class": "ms-borderColor-neutralTertiary"
    },
    "style": {
        "width": "100%",
        "margin-bottom": "15px",
        "background-color": "purple"
    },
    "children": [
        {
            "elmType": "div",
            "style": {
                "display": "flex",
                "align-items": "center"
            },
            "children": [
                {
                    "elmType": "div",
                    "style": {
                        "flex": "none",
                        "padding": "0px",
                        "padding-left": "10px",
                        "height": "40px",
                        "color": "white"
                    }
                }
            ]
        },
        {
            "elmType": "div",
            "attributes": {
                "class": "ms-fontColor-themePrimary ms-borderColor-themePrimary ms-fontWeight-bold ms-fontSize-xl ms-fontColor-neutralSecondary--hover ms-bgColor-themeLight--hover"
            },
            "style": {
                "box-sizing": "border-box",
                "width": "100%",
                "text-align": "center",
                "padding": "21px 12px",
                "overflow": "hidden",
                "color": "white"
            },
            "children": [
                {
                    "elmType": "div",
                    "txtContent": "Task Details"
                }
            ]
        }
    ]
}
 

You can customize the header by adjusting parameters such as background color, text color, and adding icons. Additionally, you can retrieve dynamic values from the body using this JSON formatting. 

 2. Customize the SharePoint Online List Form Body using JSON formatting. 

Once the “Configure layout” option is chosen, select the “Body” option from the dropdown in the “Apply formatting to” section. 

Img 03 1

In the “Formatting code” option, apply or paste the JSON formatting code provided below. Alternatively, you can retrieve this code by clicking on “Learn more” within the JSON code section. 

{
    "sections": [
        {
            // give a display name for the section
            "displayname": "✈️", // instead of an icon, you can use a section name
            "fields": [
                // reference your fields here using their display names
                "Airline",
                "Estimated airfare"
            ]
        },
        {
            // give a display name for the section
            "displayname": "🏡", // instead of an icon, you can use a section name
            "fields": [
                // reference your fields here using their display names
                "Hotel",
                "Estimated hotel cost"
            ]
        }
    ]
}

Change the given code to establish distinct sections within the form. Each section allows for the inclusion of an icon alongside its name, mirroring the example depicted below. Additionally, you have the flexibility to place any column in any section. 

Img 04 1

 

When attempting to edit the columns in the form, it is evident that the fields are presented in a section-wise manner, as demonstrated below: 

img 05 1

 

 

3. Customize the SharePoint Online List Form footer using JSON formatting. After choosing the “Configure layout” option, select the “Footer” option from the dropdown in the “Apply formatting to” section. 

Img 06 1

In the “Formatting code” option, apply or paste the JSON formatting code provided below. Alternatively, you can retrieve this code by clicking on “Learn more” within the JSON code section. 

{
    "elmType": "div",
    "style": {
        "height": "24px",
        "width": "100%",
        "color": "#fff",
        "font-size": "15px",
        "border-top": "5px solid #eee",
        "background-color": "purple",
        "padding": "10px"
    },
    "children": [
        {
            "elmType": "div",
            "style": {
                "width": "100%",
                "padding-top": "10px",
                "height": "24px"
            },
            "children": [
                {
                    "elmType": "a",
                    "txtContent": "='Task Review for ' + [$Title]",
                    "attributes": {
                        "target": "_blank",
                        "href": "='https://aka.ms/task?review=' + [$Tasks]"
                    },
                    "style": {
                        "color": "white"
                    }
                }
            ]
        }
    ]
}

In the screenshot below, you can observe that the JSON code for the footer has been included. Clicking the specified text navigates the user to the configured link.

Img 07 1

 

Show/Hide form field based on conditions: 

To meet the requirement of displaying the Manager field in the form only when the Priority is set to Critical, follow these steps: 

  • Click the “New list” button to open the list form. 
  • Choose “Edit columns” from the options available in the top-right corner of the New form. 
  • Select the column you want to hide or show, click the three dots, and select “Edit conditional formula”. 

Img 08 1

Following that, you will encounter the field where you can input your formula. As an example, here is the code for the “Estimated Airfare” field: when a specific value is selected in the “Airline” column, the estimated airfare will be displayed; otherwise, it will be hidden. 

Img 09 1

Please note that you can find more guidance by clicking on the line highlighted in green, which reads, “Learn to use conditional formulas in a list form.” 
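
For reference, a conditional formula of this shape, using the documented =if(...) syntax (the column names follow this blog's example), would show a Manager field only when Priority is set to Critical:

    =if([$Priority] == 'Critical', 'true', 'false')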

How to do column validation in SharePoint: 

  • Column validation serves to ensure that the data entered in a specific column is accurate and meets the defined criteria. 
  • Use Cases/Formulas- 
    • In my SharePoint list, there is a column named “Travel Start Date,” and I aim to enforce a rule that this date must be greater than or equal to today’s date: =TravelStartDate >= TODAY() 
      • First, enter the Edit mode for that column, click “More options,” and then click “Column validation”.
        Img 10

        Img 11

 

  • Subsequently, insert the formula in this section, and be sure to include a User message. If the conditions specified in the formula are not met, this message will be displayed to the user.

    Img 12

Inline column validation in SharePoint will not be effective for this situation since it doesn’t allow retrieving or comparing values from other columns. To address this limitation, it is necessary to utilize list-level validation settings, as illustrated in the image below. This approach enables the use of multiple formulas to compare values from different columns. 

Img 13

 

In the image above, the user can specify a formula ensuring that the travel end date is greater than the travel start date. Additionally, the sum of Estimated Airfare and Estimated Hotel Cost must be greater than zero. It’s crucial to understand that Estimated Airfare and Estimated Hotel Cost are individual columns, and their respective formulas won’t work in column validation settings. To overcome this, these formulas must be written in the list setting validation. Moreover, when employing multiple formulas, it’s important to write user messages separately, as demonstrated in the image, as there isn’t a distinct method to display messages within the formula. 
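
As a sketch of what that combined list-level formula could look like (the column names follow this blog's example), both rules can be joined with AND:

    =AND([Travel End Date] > [Travel Start Date], ([Estimated Airfare] + [Estimated Hotel Cost]) > 0)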

Data Migration Best Practices for 2024 with Tzunami


As we move forward through 2024, the landscape of data migration has transformed dramatically, reflecting the rapid advancement of technology and the exponential growth of data. Today, organizations are compelled to migrate data not just for modernization, but as a strategic move towards efficiency and innovation. The journey to this point has been marked by learning from past challenges, adopting new technologies, and refining strategies to ensure seamless, secure, and effective data migration. This blog post will outline essential migration best practices, focusing on planning, detailed risk assessment, and the adoption of cutting-edge technologies. These practices underscore the importance of understanding your data landscape and ensuring compatibility between source and target systems. Moreover, we will dive into the latest trends in data migration for 2024.

8 Data Migration Best Practices 

While there is no universal consensus on the migration best practices, most top technology companies agree on several top priorities. These practices are not just advice and tips but can serve as a guide and blueprint for companies before, during and after migration.

1. Create a methodology

Adopting an efficient methodology is crucial for the success of any data migration project. A striking statistic from a Bloor report highlights that 38% of data migration projects exceed their allocated time or budget, underscoring the challenges inherent in these endeavors. 

Among the methodologies available, the Practical Data Migration (PDM) approach, developed by industry veteran Johnny Morris, stands out. This methodology not only offers a structured process but also includes training and certification, providing a comprehensive framework for managing migration projects.

Another methodology is the Information Asset Inventory (IAI), a systematic method used by organizations to identify, classify, and manage their information assets effectively. This process is critical for understanding the scope of an organization’s data, ensuring proper data management practices, and supporting strategic decisions regarding data security, compliance, and risk management. Here’s how it typically unfolds and why it’s essential, especially from an IT manager’s perspective:

  • Identification: Cataloging all the information assets within the organization. This includes databases, files, and applications.
  • Classification: Assigning a value and sensitivity level to each asset based on its importance to the business, regulatory requirements, and potential impact of compromise or loss.
  • Ownership: Determining who within the organization is responsible for managing, securing, and maintaining each asset.
  • Protection: Assessing current security measures and identifying any gaps in the protection of these assets to ensure they are safeguarded against threats.

2. Analyze your data migration project

Assessments serve as the critical groundwork for any data migration project. Conducting a thorough migration assessment equips organizations with a comprehensive understanding of the entire data transfer process, including the transition of data across different locations, formats, and systems. It’s essential for these assessments to pinpoint potential risks and advantages, identify the current storage locations of data or systems, and determine their migration destinations. The proposed mapping between source and destination is key to planning and executing the migration.

Furthermore, the assessment phase should clarify whether to employ an all-at-once or a phased or gradual migration strategy. It’s also the time to outline the necessary budget and timeframe, establishing a detailed project timeline early in the process.

Evaluating essential elements such as backup procedures, target systems, security measures, and support infrastructure is indispensable during a data migration assessment. Organizations must also consider possible cloud provider downtime. While less of a concern in the cloud, any planned downtime, in both source and destination, needs to be planned for and accommodated.

 “Downtime is an inevitable reality for almost every business,” said Yonatan Hatzor, co-founder and CEO of Parametrix, a company specializing in cloud downtime insurance. “As companies increasingly rely on cloud providers for day-to-day operations, they are exposed to the real risk of downtime.”

According to a report by Parametrix, the major cloud service providers, commanding over two-thirds of the global cloud market, experienced 1,190 performance interruptions in their cloud infrastructure in 2022. Out of these, 492 disruptions were deemed critical. The report also highlights that approximately one-third of these incidents affected cloud regions in the United States, with the remaining critical disruptions distributed fairly evenly across Europe, Asia, and other parts of the world.

3. Engage stakeholders outside of the IT department

Engaging with stakeholders beyond the IT department is crucial when initiating the planning and design phases of a migration project. It’s essential to inform all employees about the potential impact of the migration on their daily tasks and to offer them a chance to share their input.

Frequently, staff members from departments other than IT play a pivotal role in the migration effort. Their insights into the practical usage of data assets and systems in their daily operations, as well as their expectations for improvements following the migration, can significantly inform and enhance the migration strategy.

4. Test in depth in a controlled environment

When exploring the capabilities of a new system, ask its providers whether they offer tools that can bolster user confidence through repeatable, user-friendly testing mechanisms. Ideally, these tests should be streamlined to minimize the need for expert users to enter extensive details, facilitating ease of use and efficiency in evaluating the system’s reliability and performance.

5. Back up your valuable data before the migration process 

Throughout the migration process, it’s imperative that the original source data or system remains unaltered, regardless of any significant data issues that may arise. To safeguard against any potential damage, alterations, or corruption, companies must perform comprehensive backups to guarantee the availability of reliable data copies when necessary.

The primary risk associated with data migration lies in the potential loss of business-critical assets or the improper handling of sensitive data. By maintaining data backups on distinct and highly secure systems, companies can effectively mitigate the risks of errors or data loss during the migration.

Cloudsfer’s cloud-to-cloud backup solution stands out in this context, offering a robust safety net by creating reliable and secure copies of data across different cloud platforms. Such backups are essential, especially considering the high stakes of losing business-critical assets or mishandling sensitive data during migration. Cloudsfer’s solution provides an additional layer of security, enabling companies to navigate the complexities of data migration with confidence, knowing they have a dependable recovery option in the event of any unforeseen errors or data loss.

6. Prioritize security throughout the process

Data encryption is a critical component of data migration, serving as a cornerstone for ensuring the security and integrity of data as it moves from one system to another. When data is encrypted, it’s converted into a secure code that can only be accessed or decrypted by users with the correct encryption key. This process safeguards sensitive information from unauthorized access, data breaches, and other cyber threats, especially during the vulnerable phases of transmission and storage in new environments.

In the context of data migration, encryption plays a dual role. First, it protects data in transit between the old and new systems, ensuring that any data intercepted during the migration remains unreadable and secure. Second, encryption secures data at rest within the new system, preventing unauthorized access even after the migration is complete.

In Tzunami Deployer, the data migration process to SharePoint Online, utilizing the Microsoft 365 migration API, is securely encrypted end-to-end. This ensures that from the moment data leaves the original source until it is safely stored in SharePoint Online, it remains protected and inaccessible to unauthorized parties. Tzunami Deployer emphasizes customer data privacy and security, asserting that it does not access or process customer data directly. The solution operates offline, behind the company firewall, minimizing the risk of data exposure and enhancing the security posture of the migration process. This approach underlines Tzunami Deployer’s commitment to maintaining the integrity and confidentiality of customer data throughout the migration journey.

7. Test your migration continuously

Organizing your migration into distinct phases is advisable. Migrating to the cloud typically takes time in proportion to the volume of data being moved, and for larger organizations dealing with vast amounts of data and complex systems, this timeline can extend further. By segmenting the migration into smaller, manageable stages, you can more effectively monitor progress and identify any obstacles early on. Conducting live testing at each stage of the migration is beneficial: these tests provide a practical way to verify your initial assessments and plans against the realities of the migration process and to evaluate the performance of systems once they have been migrated. Additionally, they offer the opportunity to refine your migration strategy in real time, ensuring a more controlled and efficient process.

Tzunami provides extensive, detailed migration reports during the export, deployment, and migration processes, enabling users to ensure all content is successfully migrated.

8. Migration optimization is necessary

Migration optimization directly contributes to the efficiency and effectiveness of your applications and services, which may be at the heart of your customer’s experience with your brand. From an operational perspective, you can achieve better performance, cost savings, and improved resilience by fine-tuning your cloud infrastructure post-migration. It also helps mitigate any potential data loss or downtime risks.

After migrating your data to SharePoint with the Tzunami migration tool, you can optimize your SharePoint environment with Reality Tech. As an award-winning provider, Reality Tech excels in SharePoint deployments, upgrades, workflows, and custom solutions. By focusing on core business operations and enhancing collaborative workflows, Reality Tech leverages its deep understanding of the business landscape to deliver solutions that significantly enhance efficiency, speed, and agility.

Deep dive into data migration trends for 2024

Data migration continues to be a critical process for businesses of all sizes. In 2024, we can expect to see some exciting trends emerge that will shape how organizations move their data between systems. Here’s a closer look at three key areas:

      1. Cloud Migration Takes Center Stage:

Businesses are rapidly embracing the cloud for its scalability, cost-efficiency, and flexibility. This naturally translates to a surge in data migration projects in 2024. Organizations will move data from on-premises storage to cloud platforms such as SharePoint Online, Office 365, Autodesk Construction Cloud (ACC), AWS, or Azure, or migrate file servers to the cloud, to unlock these benefits. Hybrid and multi-cloud strategies will also gain traction, requiring robust tools to handle complex data movement across diverse environments.

      2. Automation: The Efficiency Engine:

Manual data migrations are cumbersome and error-prone. To address this, 2024 will see a rise in automation tools that streamline the process. These tools can automate tedious tasks like data extraction, transformation, loading, and validation. 

      3. Security: A Constant Vigil:

Security remains paramount even with the ease of cloud migration and automation. Organizations will be laser-focused on data privacy and compliance throughout the process. This includes robust encryption, access controls, and leveraging data residency options offered by cloud providers. Microsoft and SharePoint Online are at the forefront of the market in terms of data security, offering robust protection measures that meet the needs of today’s digital landscape.

In conclusion, the evolution of data migration strategies in 2024 has underscored the critical importance of planning, comprehensive risk assessment, and the integration of advanced technologies. Through adopting methodologies like Practical Data Migration and Information Asset Inventory, organizations have enhanced their ability to manage complex data landscapes efficiently. As we move forward, these best practices not only serve as a foundation for successful data migration projects but also as a testament to the ever-evolving nature of technology and its role in shaping the future of business operations.

How to use a dynamic email template in Power Automate

SharePoint is a strong tool for teamwork, with many features to help teams work efficiently. One useful feature is creating templates that automatically get content from lists, which helps users avoid doing the same tasks over and over. 

In this blog post, we’ll show you how to make a template in SharePoint that pulls content from a list and sends it out by email using Power Automate. This integration between SharePoint and Power Automate can save you time and resources while making sure the right people get the important information quickly. 

In the SharePoint list, we create two lists. The first list is named “Email Template.” In this list, we create a column called “Template,” which is a multiple lines of text column. We add an email template to the “Template” column by creating a new entry in this list. 

Additionally, we create a template in which we retrieve four values from another list: “Purchaser Name,” “Delivery Date,” “Product Name,” and “Purchaser Name” again (the purchaser’s name appears twice in the template). 

IMG 01

 

The second list is named “Online Order Product Details,” where we create a “Delivery Date” column as Date and Time. Additionally, we have a “Product Name” column, which is a choice column, and we include the default “Created By” column provided by SharePoint. 

In the template, we replace “Purchaser Name” with the name of the creator, retrieved from the “Created By” column in the list. The “Delivery Date” is substituted with the dynamic value from the “Delivery Date” column in the “Online Order Product Details” list. Similarly, the “Product Name” column is replaced with the dynamic value from the “Product Name” field in the “Online Order Product Details” list.

Step 1: Add the “When an item is created” trigger action and specify the site URL and list name from which dynamic values are retrieved. The flow will be triggered when an item is created in the SharePoint list. 

IMG 02

 

Step 2: Add the “Get Items” action, specifying the site address and list name where the user creates the template in the list. 

IMG 03

 

Step 3: Add the compose action and create the expression to first replace the Purchaser name with the Created By display name. 

Add the following expression in the compose action: 

First, add the “replace” function. Then, include the “Get Items” action output to retrieve the value; take the item at index “0”, which is the first entry (ID 1) in the template list. Next, specify the column name where the template is stored as multiple lines of text. Add a comma, then the placeholder text you wish to replace. After another comma, include the “Created By” display name, which is retrieved from the triggering item. 

replace(outputs('Get_items_-_Email_Template')?['body/value'][0]['Template'], '[Purchaser Name]', triggerOutputs()?['body/Author/DisplayName']) 

IMG 04

 

Step 4: Add a new “Compose” action. Use the “replace()” expression: insert the previous compose output inside the brackets, specify the placeholder for the delivery date, and add the delivery date value. 

replace(outputs('Compose_-_Replace_the_Purchaser_name'), '[Delivery Date]', triggerOutputs()?['body/DeliveryDate'])

IMG 05
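
As an optional tweak (an assumption, not part of the original flow), the raw ISO date can be made reader-friendly by wrapping it in the standard formatDateTime() function inside the same replace():

    replace(outputs('Compose_-_Replace_the_Purchaser_name'), '[Delivery Date]', formatDateTime(triggerOutputs()?['body/DeliveryDate'], 'dd MMM yyyy'))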

 

Step 5: Add a new “Compose” action. Use the “replace()” expression: insert the previous compose output inside the brackets, specify the placeholder for the product name, and add the Product Name value. 

replace(outputs('Compose_-_Replace_with_the_Delivery_date'), '[Product Name]', triggerOutputs()?['body/ProductName/Value']) 

IMG 06

 

Step 6: Add a new “Compose” action. Use the “replace()” expression: insert the previous compose output inside the brackets, specify the second Purchaser Name placeholder, and add the Created By display name value. 

replace(outputs('Compose_-_Replace_with_the_Product_Name'), '[Purchaser Name]', triggerOutputs()?['body/Author/DisplayName']) 

IMG 07

 

Step 7: Add the “Send an email” action and fill in the required fields, adding the last compose output to the email body. Save and test the flow, then review the output. Users can change the template in the list without manually updating Power Automate each time. 

IMG 08

 

Step 8: Users can add other dynamic values in SharePoint and compose actions step by step. Save and run the flow to see the email output.

IMG 09

 

Conclusion: By establishing a SharePoint template to extract dynamic content from a list and automating email distribution through Power Automate, businesses boost efficiency and productivity. This integration saves time, ensuring data accuracy and consistency. As organizations strive for workflow optimization, SharePoint and Power Automate provide a robust framework. This solution caters to reporting, notifications, and communication, offering flexibility and scalability for managing information flow within teams and across departments. 

How to add a Multistep Form in Power Pages

Multistep forms in Power Pages are a powerful tool for collecting user input in a clear and organized way. They allow you to break down complex data collection processes into manageable steps, improving the user experience and increasing completion rates. These forms offer flexibility, allowing you to customize the number of steps, utilize conditional branching, and even track user progress to ensure a smooth and efficient data-gathering experience. 

To create a multistep form, you first need to create a basic form. Once the basic form exists, follow the steps below to build the multistep form on top of it.

Step 1: Go to the Power Pages Management

IMG01

Step 2: Click on the “Multistep Forms” and then click on the “New” button.

IMG02

Step 3:
Provide a name for your multistep form. Then, select the site where you want to create it, save your settings, and finally click on the “Form Step” tab to proceed with the creation of individual steps for your multistep form.

IMG03

Step 4:
In the “Form Step” section, click on “New Form Step”, provide the name of the step, and select the table where your basic form resides that you want to add to this step. Then click on the form definition, select the mode you want, choose the form and tab, and click on “Save”.

IMG04

Step 5:
To set the initial step for the multistep form, go to your multistep form configuration. There, you can specify which step the user will see first when accessing the multistep form. This initial step defines where the user starts in the multistep process after accessing the form.

IMG05

Step 6:
After completing the first step, the user needs to proceed to the next step. To set up the next step, return to the first step, and in the ‘Next Step’ section, select the step that you want the user to proceed to after completing the first step. Once you’ve selected the appropriate step, click on ‘Save’ to confirm the changes.

IMG06

Step 7: If you want the user to be redirected to the homepage after completing the multistep form, create one more step in the multistep form. Give it a name, and in the type, select “Redirect”.

IMG07

Step 8: Go to the “Redirect” tab and provide the URL and webpage where you want the user to be redirected after completing the form. Then, save the changes.

IMG08

Step 9: Go to the second step, and in the settings for that step, select the created redirect step as the next step. Then, click on Save.

IMG09

Step 10: After creating the multistep form, navigate to the Power Pages site. Click on the Multistep form from the components, and you will see the created multistep form. Add that form to the site.

IMG010

IMG011

Step 11: Sync the site and preview it to see the added multistep form.


Conclusion

Mastering the creation of multistep forms in Power Pages offers a powerful solution for enhancing user interaction and data collection on web pages. By breaking down complex processes into manageable steps, these forms streamline the user experience and improve completion rates. With the step-by-step guide provided, users can easily create and customize multistep forms to suit their specific needs, from defining form steps to setting up conditional branching and user redirection. By implementing these techniques, web developers can create clear and organized data collection processes that enhance user satisfaction and engagement.