
Part 8 of 11: Ongoing Optimisation and Management

Merge Cross-Channel Data and Systematically Test New Ideas with AI Design Sprint™

In the ever-evolving landscape of digital marketing, ongoing optimisation and management are crucial for maintaining a competitive edge and ensuring sustained success. However, many agencies struggle to merge cross-channel data and test new ideas systematically, which leads to fragmented insights and missed opportunities. Robust optimisation strategies are essential to refine campaigns, enhance performance, and drive continuous improvement, yet achieving this consistently can be challenging.

Today, I want to share how the AI Design Sprint™ can transform your ongoing optimisation and management processes. By integrating AI, agencies can seamlessly merge cross-channel data, systematically test new ideas, and implement data-driven optimisations. The result? Enhanced campaign performance, deeper insights, and a proactive approach to managing and improving marketing efforts.

If your agency is grappling with fragmented data, inconsistent optimisation practices, or inefficient idea testing, this article is for you.

Let’s explore how you can refine your ongoing optimisation and management strategy with precision and innovation.

Understanding the Agency

The Agency That Struggled with Ongoing Optimisation and Management

Let me take you on the journey of an agency that was experiencing rapid growth. They began as a small digital marketing firm but soon expanded their services to include comprehensive branding, advertising, content creation, SEO, and social media management. With this growth came new roles: Optimisation Manager, Data Analyst, Campaign Manager, Creative Director, Performance Specialist, and Project Manager. Exciting times, right? But with great growth came significant challenges.

Their ongoing optimisation and management processes were fragmented. Merging data from multiple channels was cumbersome, leading to incomplete insights. Systematically testing new ideas was inconsistent, causing missed opportunities for improvement and innovation. Additionally, the lack of standardised optimisation procedures sometimes resulted in stagnant campaign performance, risking client satisfaction and agency reputation. Sound familiar?

Symptoms of Inefficiency

Every optimisation cycle felt disjointed, making it tough to gain comprehensive insights across channels. Multiple team members conducting data analysis without a unified process led to inconsistent findings and conflicting recommendations. In the excitement to test new ideas, they occasionally overlooked critical data points, risking ineffective optimisations. Manual tracking and fragmented testing processes stretched the time from idea generation to optimisation implementation, making them less competitive.

The Six Staff Members: Why Each Role Matters

Let’s meet the stars of their story. The Optimisation Manager (OM) oversees the optimisation strategies but struggles with consolidating diverse data inputs. The Data Analyst (DA) conducts in-depth data analysis but finds it challenging to integrate findings from various channels seamlessly. The Campaign Manager (CM) manages campaign performance but is often delayed by incomplete data insights. The Creative Director (CD) ensures that optimisations align with creative visions but is hampered by fragmented data. The Performance Specialist (PS) tracks key performance indicators but is limited by inconsistent optimisation practices. Lastly, the Project Manager (PM) coordinates the optimisation process, battling disjointed workflows and communication gaps to keep everything on track.

Bringing these roles together in an online AI Design Sprint™ workshop was a game-changer. Facilitated by a certified expert, their goal was clear: overhaul their ongoing optimisation and management process, unify team inputs, and embed AI-driven efficiencies from the ground up.

The Steps

Transformations like these need a roadmap. They followed a structured AI Design Sprint™ process, ensuring every detail was covered and every voice was heard.

Mapping the Current Ongoing Optimisation and Management Process (Steps 1–3)

Step 1: Check Whether the Process Describes Reality

They kicked things off with Zoom and Miro, mapping out their existing optimisation and management workflows. Everyone shared how data was merged, ideas were tested, optimisations were implemented, and performance was tracked. Within an hour, their current state looked something like this:

  1. Data Collection: Gathering data from various marketing channels.
  2. Data Integration: Merging cross-channel data into a unified dashboard.
  3. Idea Generation: Brainstorming new optimisation ideas.
  4. Idea Testing: Implementing A/B tests and other experimentation methods.
  5. Performance Analysis: Analysing the results of optimisation tests.
  6. Reporting: Compiling performance reports for internal review and client presentation.
  7. Feedback and Adjustment: Making necessary adjustments based on performance data.
  8. Implementation: Rolling out successful optimisations across campaigns.
  9. Ongoing Monitoring: Continuously tracking campaign performance and making incremental improvements.
  10. Final Evaluation: Assessing overall optimisation success and financial outcomes.

During this step, the team discussed how each stage was executed and identified discrepancies between the documented process and actual practices. This conversation highlighted areas where informal practices deviated from the intended workflow, providing a realistic view of their ongoing optimisation and management process.

Step 2: Mark and Describe Pain Points with Red Post-Its

Using red sticky notes, they pinpointed the major obstacles. The lack of standardised data integration templates led to inconsistency during data merging. Manual optimisation processes created bottlenecks in implementing tests. Messaging inconsistencies during performance reporting sometimes led to misinterpretations, and tracking campaign performance became a mess during ongoing monitoring stages.

The team shared their frustrations and experiences, discussing how these pain points affected their daily work and the overall efficiency of the optimisation and management process. This open dialogue ensured that everyone was on the same page regarding the challenges they faced.

Step 3: Mark High-Value Points with Green Dots

They then prioritised the most critical issues with green dots. Standardised data integration templates, a streamlined optimisation testing process, consistent performance reporting, and automated performance tracking emerged as their focus areas for AI integration.

In a lively discussion, team members voted on which pain points would deliver the most significant improvements if addressed. This prioritisation helped them focus their efforts on the areas that would make the biggest impact.

AI Cards and Prioritisation (Steps 4–5)

Step 4: Go Through AI Cards Two at a Time, Copy Them If Relevant

They explored AI solution categories. AI Workflow Automation systems streamline data integration and optimisation testing processes. AI Data Integration solutions consolidate cross-channel data into unified dashboards. AI Content Generation tools assist in drafting consistent performance reports. AI Performance Analytics ensure accurate tracking and reporting of campaign outcomes.

The team examined each AI category, discussing how these solutions could address their identified pain points. By understanding the capabilities of each AI tool, they could better match them to their specific needs.

Step 5: Prioritise the Three Most Important AI Cards for Each Step

Through dot-voting, they landed on AI Workflow Automation to streamline optimisation testing, AI Data Integration to unify cross-channel data, and AI Performance Analytics to automate performance tracking and reporting. After reviewing the options, the team collectively agreed that these three AI solutions would most effectively tackle their top pain points. This consensus ensured that their focus was aligned and that they were prioritising solutions with the highest potential impact.

Focusing the Sprint (Steps 6–7)

Step 6: Select Two Process Steps to Focus On

They decided to zero in on the data integration process and the optimisation testing phase. Ensuring accurate and consistent data merging and streamlining the testing of new ideas were identified as the most impactful areas for improvement.

The team discussed which stages of the optimisation and management process would benefit most from AI integration. By focusing on these two critical areas, they aimed to create a solid foundation for improving overall efficiency.

Step 7: Move and Reformulate the Two Selected Focus Points

These became their anchor points: the AI-Powered Data Integrator, which ensures seamless and accurate merging of cross-channel data, and the Automated Optimisation Tester, which systematically tests new ideas with minimal manual intervention.

During this step, they redefined their focus areas to clearly outline how AI would enhance each process. This clarity helped the team understand the specific roles AI would play in their ongoing optimisation and management workflow.

Integrating and Examining Neighbouring Steps (Steps 8–9)

Step 8: Integrate the Anchor Points into the Existing Process

They introduced the AI-Powered Data Integrator to merge data from various marketing channels based on standardised templates and client inputs. Simultaneously, they implemented the Automated Optimisation Tester to manage the testing of new ideas, track results, and ensure timely implementation of successful optimisations.

They mapped out how these AI tools would fit into their current workflow, ensuring a seamless transition. The team discussed the integration points and how each role would interact with the new systems.

Step 9: See How Neighbouring Steps Are Influenced

AI can now suggest optimal data merging configurations based on campaign data during the data integration stage. Consistent performance reporting enhances the clarity and professionalism of their client communications. Additionally, automated tracking ensures timely and organised performance monitoring, reducing missed data points and delays.
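
Because “automated tracking” can mean many things, here is a minimal sketch of the kind of check such a system might run, assuming daily conversion totals per channel land in a pandas DataFrame; the column names, the monitoring_alerts function, and the 40% drop threshold are illustrative assumptions rather than part of the agency’s actual stack.

```python
import pandas as pd

# Minimal sketch: flag missed data points and sudden performance drops
# across channels. Column names and thresholds are illustrative assumptions.

def monitoring_alerts(df: pd.DataFrame, drop_threshold: float = 0.4) -> list[str]:
    """df has columns: date, channel, conversions (daily totals per channel)."""
    alerts = []

    # 1. Missing data points: every channel should report every date.
    expected = pd.MultiIndex.from_product(
        [df["date"].unique(), df["channel"].unique()], names=["date", "channel"]
    )
    reported = pd.MultiIndex.from_frame(df[["date", "channel"]])
    for date, channel in expected.difference(reported):
        alerts.append(f"Missing data: {channel} on {date}")

    # 2. Sudden drops: compare each day with the previous day per channel.
    df = df.sort_values("date")
    for channel, grp in df.groupby("channel"):
        change = grp["conversions"].pct_change()
        for date, pct in zip(grp["date"], change):
            if pd.notna(pct) and pct < -drop_threshold:
                alerts.append(f"Drop of {abs(pct):.0%} in conversions: {channel} on {date}")

    return alerts
```

In a setup like the one described, checks of this kind would run on a schedule and route alerts straight to the Performance Specialist instead of waiting for a manual review.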

The team evaluated the ripple effects of integrating AI into their optimisation and management process. This foresight helped them anticipate and address potential changes in related stages, ensuring a holistic improvement.

Big-Picture Review & People-AI Interactions (Steps 10–12)

Step 10: Rethink the Entire Process

They stepped back to ensure their revamped workflow addressed the initial pain points without introducing new challenges. It was about making sure AI would enhance their efficiency and consistency while keeping their strategic and user-centred approach intact.

The team engaged in a thorough review, discussing whether the new integrations aligned with their overall goals and maintained their agency’s unique value proposition. This step was crucial for ensuring that the AI tools complemented rather than compromised their optimisation and management processes.

Step 11: Add How and Where People Could Help AI Perform Well

The Optimisation Manager (OM) oversees the AI-Powered Data Integrator, providing strategic inputs and refining outputs. The Data Analyst (DA) collaborates with AI tools to conduct in-depth data analysis and personalise optimisation insights. The Campaign Manager (CM) ensures AI-generated optimisations align with campaign goals. The Creative Director (CD) integrates data-driven optimisations with creative strategies. The Performance Specialist (PS) tracks key performance indicators using AI analytics. Lastly, the Project Manager (PM) utilises the Automated Optimisation Tester to manage and implement successful optimisations.

They delineated each team member’s role in the AI-enhanced process, ensuring that everyone knew how to interact with the new tools effectively. This clarity fostered ownership and accountability within the team.

Step 12: Mark Sketches Where a Person Interacts with AI

They created a flow diagram to visualise interactions. The Data Analyst inputs campaign data into the AI-Powered Data Integrator. The Optimisation Manager reviews and personalises AI-generated optimisations. The Campaign Manager ensures alignment with campaign goals. The Performance Specialist utilises the AI Performance Analytics to monitor results. The Project Manager manages the Automated Optimisation Tester to implement successful optimisations.

By mapping out these interactions, the team gained a clear understanding of their touchpoints with the AI tools. This visual aid facilitated smoother collaboration and highlighted areas where human input was essential.

Refining Anchors & Tasks (Steps 13–14)

Step 13: Revisit Anchor Points and Improve Descriptions

They refined their anchor points to ensure clarity and precision. The AI-Powered Data Integrator now seamlessly merges cross-channel data based on standardised templates, client inputs, and data-driven insights. The Automated Optimisation Tester manages the testing of new ideas, tracks results, and ensures timely implementation through AI-driven task assignments and notifications.

This step helped the team fully grasp the functionalities and benefits of each AI tool, facilitating better implementation.

Step 14: Break Each Anchor Point into Four Sub-Steps

For the AI-Powered Data Integrator (a minimal code sketch follows these sub-steps):

  1. Template Configuration: Define standardised data integration templates with consistent metrics and parameters.
  2. Campaign Data Integration: Input campaign-specific information and data sources into the AI system.
  3. Data Merging: AI seamlessly merges cross-channel data based on templates and campaign data.
  4. Human Verification: Data Analyst reviews, edits, and personalises the AI-generated data integrations.
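
To make sub-steps 1–3 concrete, here is a minimal sketch of what a standardised integration template and the automated merge could look like, assuming each channel delivers a daily export as a pandas DataFrame; the channel names, column mappings, and merge_cross_channel helper are illustrative assumptions, not the agency’s real schema or tooling.

```python
import pandas as pd

# Minimal sketch of sub-steps 1-3: a standardised integration template maps each
# channel's export columns onto shared metric names, and the merge step stacks
# the normalised data into one unified view. All names are illustrative.

INTEGRATION_TEMPLATE = {
    "google_ads": {"cost": "spend", "clicks": "clicks", "conv": "conversions"},
    "meta_ads":   {"amount_spent": "spend", "link_clicks": "clicks", "results": "conversions"},
    "email":      {"campaign_cost": "spend", "unique_clicks": "clicks", "orders": "conversions"},
}

def merge_cross_channel(raw_exports: dict) -> pd.DataFrame:
    """raw_exports maps channel name -> that channel's raw export (one row per day)."""
    frames = []
    for channel, mapping in INTEGRATION_TEMPLATE.items():
        if channel not in raw_exports:
            continue  # surfaced later as a monitoring alert rather than failing silently
        df = raw_exports[channel].rename(columns=mapping)
        # Keep only the standardised metrics and tag the source channel.
        df = df[["date"] + list(mapping.values())].assign(channel=channel)
        frames.append(df)
    unified = pd.concat(frames, ignore_index=True)
    # Sub-step 4 (human verification) happens on this unified table,
    # before it feeds the dashboard.
    return unified
```

Keeping the template as explicit configuration is what makes the merge repeatable across campaigns and gives the Data Analyst a single place to verify before sign-off.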

For the Automated Optimisation Tester (again, a sketch follows the sub-steps):

  1. Submission for Testing: New optimisation ideas are automatically routed to the Automated Optimisation Tester.
  2. Idea Testing: AI conducts systematic tests on new ideas using A/B testing and other methodologies.
  3. Result Analysis: AI analyses test results and generates optimisation reports.
  4. Final Implementation: Optimisation Manager oversees the final implementation process, ensuring successful optimisations are rolled out across campaigns.
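
As an illustration of how the Idea Testing and Result Analysis sub-steps could decide whether an idea has won, here is a minimal sketch of a two-proportion z-test on conversion counts; the ab_test_result function, the example numbers, and the 5% significance threshold are assumptions for illustration, not the statistical engine behind the Automated Optimisation Tester.

```python
import math

# Minimal sketch of the "Idea Testing" / "Result Analysis" sub-steps: a two-sided
# z-test for the difference between control and variant conversion rates.
# Sample numbers and the 0.05 significance level are illustrative assumptions.

def ab_test_result(control_conv: int, control_n: int,
                   variant_conv: int, variant_n: int,
                   alpha: float = 0.05) -> dict:
    p_c = control_conv / control_n
    p_v = variant_conv / variant_n
    p_pool = (control_conv + variant_conv) / (control_n + variant_n)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / variant_n))
    z = (p_v - p_c) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return {
        "control_rate": p_c,
        "variant_rate": p_v,
        "uplift": (p_v - p_c) / p_c,
        "p_value": p_value,
        "significant": p_value < alpha,
    }

# Example: a new landing-page idea tested against the current version.
print(ab_test_result(control_conv=480, control_n=12000,
                     variant_conv=560, variant_n=12000))
```

A production tester would also guard against peeking and might prefer sequential or Bayesian methods, but the core comparison looks much like this.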

Breaking down each anchor point into actionable sub-steps provided a clear roadmap for implementation. The team could now follow these detailed steps to ensure each aspect of the process was addressed effectively.

Final Declarations & Ethical Checks (Steps 15–17)

Step 15: Restate the AI Technologies Used

They utilised AI Workflow Automation through machine learning algorithms for data integration and optimisation testing. AI Data Integration consolidated cross-channel data from various sources, enhancing the depth and accuracy of insights. AI Performance Analytics ensured accurate tracking and reporting of campaign outcomes.

They reiterated the technologies involved to ensure everyone was on the same page regarding the tools they were implementing. This step reinforced the technical foundation of their new process.

Step 16: Give the New Process a Catchy Name

They named their new process “OptiSync,” highlighting the seamless synchronisation and intelligent automation it brings to their ongoing optimisation and management workflows.

Choosing a memorable name helped the team embrace the new process. It fostered a sense of identity and ownership, making it easier to communicate and refer to the system internally.

Step 17: Individually Select Relevant AI Ethics Cards

They selected Transparency, Privacy, Fairness, and Accountability. Transparency ensures they clearly communicate AI involvement in data integration and optimisation testing. Privacy safeguards client and campaign data within the AI systems. Fairness avoids biased data analysis and ensures equitable treatment of all optimisation efforts. Accountability maintains human oversight over AI-generated optimisations and tracking processes to ensure accuracy and integrity.

Addressing ethical considerations was paramount. The team discussed how to uphold these principles, ensuring their AI integrations were responsible and trustworthy.

Scenario Planning & Realignment (Steps 18–21)

Step 18: Write Best- and Worst-Case Scenarios

In the best-case scenario, OptiSync seamlessly merges cross-channel data, enhances optimisation accuracy, and reduces optimisation and management time by 50% within three months, leading to a 20% increase in client satisfaction and retention. In the worst-case scenario, AI-generated optimisations are inaccurate, leading to ineffective campaign adjustments and a decline in client trust.

The team brainstormed potential outcomes, both positive and negative. This foresight helped them prepare strategies to maximise benefits and mitigate risks.

Step 19: Reflect on Implications for the AI Solution

To mitigate risks, they implemented rigorous Quality Assurance measures to ensure accuracy in AI-generated data integrations and optimisations. Continuous Training of AI models with new data improved relevance and authenticity. Additionally, Feedback Loops encouraged team feedback to refine AI tools continuously.

They discussed practical steps to address the worst-case scenarios, ensuring that their AI tools remained effective and trustworthy.

Step 20: Share with the Team and Align

They held an all-hands virtual meeting to present OptiSync. Each role understood their responsibilities and the importance of collaboration in the AI-enhanced workflow. The team was excited and committed to the new process.

Sharing the plan with the entire team fostered alignment and enthusiasm. Everyone felt involved and motivated to contribute to the successful implementation of the new system.

Step 21: Go Back to the Solution and Refine

Based on team feedback, they integrated advanced data verification features for Data Analysts to easily modify AI-generated data integrations, simplified the Automated Optimisation Tester for easier navigation and management, and scheduled comprehensive training sessions to ensure all team members were proficient with the new tools.

Refining the solution based on feedback ensured that the tools were user-friendly and met the team’s needs effectively. This iterative improvement was key to smooth adoption.

Note: Typically at this point a “Tech Check” is carried out by external partners or internal tech teams to confirm that the proposed solution is technically viable. If it is, an agile prototyping phase follows to develop and fine-tune the solution before full development and integration.

Pilot & External Feedback (Steps 22–24)

Step 22: Pitch the Solution to External Users

They rolled out a pilot version of OptiSync with select clients. Clients appreciated the data-driven and comprehensive optimisation insights, but some desired more personalised optimisation recommendations, prompting further refinement.

Engaging external users provided valuable insights into how their optimisation and management processes were perceived, highlighting both strengths and areas for improvement.

Step 23: Seek Feedback

They compiled feedback into a Miro board, highlighting strengths such as improved data integration accuracy and reduced optimisation time, alongside areas for improvement like enhanced personalisation and flexibility in optimisation strategies.

Organising feedback systematically allowed them to identify common themes and prioritise adjustments effectively.

Step 24: Discuss Feedback and Improve the Concept

Key adjustments included configuring AI tools to incorporate client-specific nuances and preferences, enabling the AI-Powered Data Integrator to learn from past successful optimisations and client feedback for better future outputs, and improving the workflow interface’s usability based on client and team suggestions.

Implementing these refinements made OptiSync more responsive to client needs and easier for the team to use. With that, OptiSync was ready for full-scale implementation.

The Final Solution – “OptiSync”

Recap of Pain Points Addressed

By the end of their 24-step journey, they had successfully tackled several key issues. Fragmented Data Integration was resolved through standardised data integration templates and AI-powered data merging, ensuring uniformity and accuracy across all campaigns. Inconsistent Optimisation Testing was unified with AI Workflow Automation, bringing together optimisation processes from Optimisation Managers and Data Analysts seamlessly. Tracking Errors were mitigated by AI Performance Analytics that incorporated checks to prevent data discrepancies. Finally, Time-Consuming Processes were significantly reduced through AI Workflow Automation, streamlining the ongoing optimisation and management timelines.

The Pilot & Next Steps

Next Steps

Looking ahead, they plan to integrate machine learning models to enhance data integration with more nuanced and personalised insights. They aim to incorporate deeper personalisation features to cater to diverse client needs and implement AI-driven analytics to predict optimisation success rates and adjust their strategies accordingly.

Why a Step Process Works

Thoroughness vs. Speed

While the temptation to deploy AI solutions rapidly is strong, the complexity of ongoing optimisation and management demands a thoughtful approach. Their structured AI Design Sprint™ ensured comprehensive problem identification, tackling root causes rather than just the symptoms. Systematic AI exploration allowed them to methodically evaluate AI solutions to find the perfect fit. Inclusive team involvement engaged all relevant roles, fostering ownership and collaboration. Robust testing and feedback integration validated solutions through pilot testing and iterative refinements.

Everyone Sees Their Part in the Puzzle

From the Optimisation Manager overseeing AI-powered data integration to the Data Analyst leveraging unified data for strategic decisions, every role understood their contribution. This holistic involvement ensured OptiSync was seamlessly integrated and embraced across the agency.

AI with Purpose, Not Gimmick

Each AI solution directly addressed specific pain points. AI Workflow Automation enhanced consistency and efficiency in data integration and optimisation testing. AI Data Integration unified cross-channel data for coherent and comprehensive insights. AI Performance Analytics ensured accurate tracking and reporting of optimisation outcomes. Ethical considerations, such as transparency and accountability, were integral, ensuring AI served as a tool for empowerment rather than replacement.

Key Takeaways

The Resulting “OptiSync”

Their final OptiSync includes standardised data integration templates, ensuring consistent quality and accuracy across all optimisation processes. The AI-Powered Data Integrator seamlessly merges cross-channel data based on standardised templates and campaign inputs. Human Verification allows Data Analysts to personalise and enhance AI-generated data integrations. The Automated Optimisation Tester systematically tests new ideas, tracks results, and ensures timely implementation of successful optimisations. AI Performance Analytics tracks and reports campaign performance accurately. Unified Data Integration consolidates inputs from Optimisation Managers, Data Analysts, and Campaign Managers into a single optimisation system. Lastly, their ethics framework maintains transparency about AI involvement and ensures client data privacy.

Lessons

Tailoring AI tools to address specific pain points rather than adopting generic solutions was crucial. Engaging all relevant roles ensured a comprehensive and cohesive strategy implementation. Upholding ethical standards maintained trust and integrity in their AI-driven processes. Continuous feedback and iterative refinement enhanced the effectiveness and adaptability of their AI tools.

Forward-Looking Upgrades

With OptiSync in place, their agency plans to develop a machine learning-based data integrator to enhance personalisation and contextual relevance. They aim to expand to multimedia data integration by incorporating AI tools for generating and integrating multimedia performance metrics. Additionally, integrating predictive analytics will help forecast optimisation success rates and proactively adjust their strategies.

Last Thoughts

In the fast-paced world of digital marketing, blending data-driven insights with strategic optimisations is key. Integrating AI into your ongoing optimisation and management strategy isn’t just about keeping up with trends—it’s about strategically implementing tools that solve real challenges and drive meaningful results.

For agencies struggling with fragmented data and inconsistent optimisation practices, embracing a structured AI Design Sprint™ can lead to a unified, efficient, and data-driven optimisation ecosystem. Gather your team, dive into a collaborative AI Design Sprint™, and watch your ongoing optimisation and management efforts transform into a synchronised and impactful force.