Dynatrace: a multifaceted approach to AI in 2024

In 2024, generative AI will enter the final stages of its hype cycle, and organizations will realize that the technology, while transformative, cannot deliver significant value on its own. As a result, they will move towards a multifaceted approach to artificial intelligence that combines generative AI with other types of AI and additional data sources. This is the main trend for 2024 identified by Dynatrace.

This approach will enable more advanced reasoning and add precision, context and meaning to the results produced by generative AI. For example, DevOps teams will combine generative AI with fact-based causal and predictive AI to power digital innovation by predicting and preventing problems before they arise and generating new workflows to automate the software delivery lifecycle.
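As a loose illustration of this pairing, the Python sketch below uses a simple statistical predictor to flag a metric anomaly and only then asks a (stubbed) generative step to draft a remediation workflow for human review. The metric data and function names are invented for the example and do not represent a Dynatrace API.

# Illustrative only: a predictive layer supplies the facts, and the
# generative layer turns them into a candidate workflow for review.
from statistics import mean, stdev

def predict_anomaly(history, latest, z_threshold=3.0):
    """Simple z-score predictor: flag values far outside recent behavior."""
    if len(history) < 2:
        return False
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(latest - mu) / sigma > z_threshold

def draft_remediation(context):
    # Placeholder for a generative-AI call that proposes a workflow.
    return f"Proposed runbook for: {context} (review before applying)"

latencies_ms = [102, 98, 105, 101, 99, 103, 100]  # recent observations
incoming = 240.0  # latest observation

if predict_anomaly(latencies_ms, incoming):
    print(draft_remediation(f"p95 latency spiked to {incoming} ms"))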

AI-generated code will create the need for digital immune systems

In 2024, more and more organizations will experience serious disruptions to digital services caused by poor-quality and insufficiently supervised software code.

Developers will increasingly use autonomous agents powered by generative AI to write code for them, putting their organizations at increased risk of unexpected issues affecting customer and user experiences. The challenge of maintaining code generated by autonomous agents is similar to that of maintaining code created by developers who have left the organization: none of the remaining team members fully understands it, so no one can quickly fix problems when they arise. Additionally, those attempting to use generative AI to examine and troubleshoot code created by autonomous agents will face a recursive problem, as they will still lack the fundamental knowledge and understanding needed to manage that code effectively.

These challenges will push organizations to develop a digital immune system, combining practices and technologies across software design, development, operations and analytics to protect their software from the inside out by making code resilient by default. To do this, organizations will leverage predictive AI to automatically anticipate issues in code or applications before they appear and trigger an immediate, automated response to protect the user experience. For example, development teams can design applications with self-healing capabilities: automatic rollback to the latest stable version of the code base if a new release introduces errors, or automated provisioning of additional cloud resources to support a spike in demand for computing power.
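A minimal sketch of that rollback behavior, assuming a deploy hook and an error-rate feed that are stand-ins for a real delivery platform and observability source:

ERROR_BUDGET = 0.02  # assumed threshold: roll back above a 2% error rate

def deploy(version):
    print(f"deploying {version}")

def observe_error_rate(version):
    # In practice this would come from live observability data; simulated here.
    simulated = {"v1.4.0": 0.003, "v1.5.0": 0.11}
    return simulated.get(version, 0.0)

def deploy_with_self_healing(new_version, last_stable):
    """Deploy a release and automatically revert if it breaches the error budget."""
    deploy(new_version)
    if observe_error_rate(new_version) > ERROR_BUDGET:
        print(f"{new_version} breached the error budget; rolling back to {last_stable}")
        deploy(last_stable)
        return last_stable
    return new_version

active = deploy_with_self_healing("v1.5.0", last_stable="v1.4.0")
print(f"active version: {active}")

The point of the design is that the decision to revert is driven by observed health data rather than by someone watching dashboards.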

Organizations will appoint a Chief AI Officer to oversee the safe and responsible use of artificial intelligence

In 2024, organizations will increasingly appoint senior executives to their leadership teams to prepare for the impacts of AI on security, compliance and governance.

As employees become more accustomed to using AI in their personal lives through exposure to tools such as ChatGPT, they will seek to use AI more to increase their productivity at work. Organizations have already understood that if they do not officially authorize their employees to use AI tools, employees will do so without approval. Organizations will therefore appoint a Chief AI Officer (CAIO) to oversee the use of these technologies, much as many companies already have a Chief Information Security Officer (CISO) on their leadership teams. The CAIO will focus on developing policies and on training and empowering staff to use AI safely, protecting the organization from accidental non-compliance, intellectual property leaks and security threats. These practices will pave the way for widespread adoption of AI within organizations. As this trend progresses, AI will become a commodity, just like the mobile phone.

Data observability will become a requirement

In 2024, data observability will become mandatory as organizations need to implement smarter automation and faster decision-making.

The volume of data continues to double every two years, and organizations are looking to ingest and analyze it faster and at larger scale. However, the cost and risk of poor data quality are greater than ever: in a recent survey, 57% of DevOps professionals said that a lack of data observability makes it difficult to conduct automation in a compliant manner. As a result, organizations will increasingly need solutions that provide data observability, allowing them to quickly and securely acquire reliable, high-quality data that is ready for analysis on demand.

Better data observability will enable users such as IT operations and business analytics teams to understand data availability as well as the structure, distribution, relationships, and lineage of that data across all sources, including multiple platforms in distributed hybrid and multi-cloud environments. This understanding is essential for generating information that users can trust by ensuring data is up to date, identifying anomalies, and eliminating duplicates that could cause errors.
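To make this concrete, here is a hedged sketch of two of the checks mentioned above, freshness and duplicate detection, run over a small batch of records. The field names and the 24-hour freshness window are assumptions for the example.

from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
records = [
    {"id": 1, "value": 10.5, "updated": now - timedelta(minutes=5)},
    {"id": 2, "value": 11.0, "updated": now - timedelta(hours=30)},  # stale
    {"id": 2, "value": 11.0, "updated": now - timedelta(hours=30)},  # duplicate
]

def stale_ids(rows, max_age=timedelta(hours=24)):
    # Freshness check: records not updated within the allowed window.
    return {r["id"] for r in rows if now - r["updated"] > max_age}

def duplicate_ids(rows):
    # Duplicate check: ids that appear more than once.
    seen, dupes = set(), set()
    for r in rows:
        (dupes if r["id"] in seen else seen).add(r["id"])
    return dupes

print("stale records:", stale_ids(records))
print("duplicate ids:", duplicate_ids(records))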

Organizations will extend observability to more business use cases as leadership seeks to support sustainability and FinOps goals.

By 2024, the combined pressure to adopt more environmentally friendly business practices and address rising cloud costs will move observability from an IT priority to a business requirement.

The increased use of AI by organizations will be a key driver of this trend, as it will lead to increased consumption of cloud resources and, with it, a larger carbon footprint. However, AI-driven analysis of observability data can help organizations address these challenges and advance their FinOps and sustainability practices by surfacing actionable insights and powering intelligent automation that targets areas of inefficiency in cloud environments. Increased use of AI-driven observability will enable organizations to automatically orchestrate their systems for optimal resource utilization, reducing both the emissions and the costs of managing their cloud environments. As a result, we will see growing interest in observability use cases outside of IT as the enterprise as a whole begins to take notice.
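As an illustration only, the sketch below shows the kind of rightsizing pass such automation could perform, flagging under-utilized instances for downscaling; the instance data and the 15% utilization threshold are invented for the example.

UNDERUSED = 0.15  # assumed: below 15% average CPU, recommend a smaller size

instances = [
    {"name": "web-1", "cpu_avg": 0.71, "hourly_cost": 0.40},
    {"name": "batch-2", "cpu_avg": 0.06, "hourly_cost": 0.90},
    {"name": "cache-3", "cpu_avg": 0.12, "hourly_cost": 0.25},
]

def rightsizing_candidates(fleet):
    # Under-utilized instances waste both money and energy.
    return [i for i in fleet if i["cpu_avg"] < UNDERUSED]

candidates = rightsizing_candidates(instances)
for inst in candidates:
    print(f"downsize {inst['name']} (avg CPU {inst['cpu_avg']:.0%})")

# Rough estimate assuming a half-size instance costs half as much.
savings = sum(i["hourly_cost"] for i in candidates) * 0.5
print(f"estimated hourly savings: ${savings:.2f}")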

Platform engineering will become fundamental

In 2024, platform engineering will become fundamental. Organizations will recognize that a secure and functional software delivery pipeline is just as vital to business continuity as the quality and security of the digital services that end users and customers rely on. We will therefore see an evolution towards tools that facilitate engineering best practices across DevOps, security and site reliability. This will bring platform engineering to the forefront as organizations codify the know-how and capabilities needed to automate secure software delivery pipelines. As this trend takes hold, software delivery and security processes and operations will be enabled through application programming interfaces (APIs) that automate these tasks based on real-time insights from software observability data.
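A hypothetical sketch of that API-driven gating: a pipeline step queries live health signals and promotes a build only when a codified release policy passes. The fetch function is a stub, not a real vendor endpoint, and the thresholds are illustrative.

def fetch_health_signals(service):
    # Stand-in for a call to an observability platform's API.
    return {"error_rate": 0.004, "p95_latency_ms": 180, "open_vulns": 0}

def promotion_gate(signals):
    # The release policy, codified once by the platform team.
    return (signals["error_rate"] < 0.01
            and signals["p95_latency_ms"] < 300
            and signals["open_vulns"] == 0)

signals = fetch_health_signals("checkout-service")
print("promote build" if promotion_gate(signals) else "block promotion")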

Organizations will phase out legacy SIEM solutions as security teams seek more intelligent threat analysis.

By 2024, next-generation threat intelligence and analytics solutions will drive the phasing out of SIEM (Security Information and Event Management) systems.

These modern solutions enable security teams to extend their capabilities beyond log analysis, drawing context from a wider range of data modalities and from different types of artificial intelligence, including generative, causal and predictive techniques working together. As a result, organizations will gain deeper, more accurate, intelligent and automated threat analysis that helps them protect their applications and data against increasingly sophisticated threats.
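As a loose illustration of that wider context, the sketch below joins a suspicious log event with trace and session data via a shared request id so that analysis works on more than log text alone; the data shapes are invented for the example.

log_events = [{"request_id": "r-42", "msg": "repeated auth failures", "severity": "high"}]
traces = {"r-42": {"service_path": ["gateway", "auth", "db"], "duration_ms": 950}}
sessions = {"r-42": {"user": "unknown", "geo": "anonymized exit node"}}

for event in log_events:
    rid = event["request_id"]
    # Enrich the log alert with trace and session context before scoring.
    context = {"log": event, "trace": traces.get(rid), "session": sessions.get(rid)}
    print(f"enriched alert for {rid}: {context}")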
