For an industry that prides itself on being at the forefront of innovation, 2024 has been a challenging year. Cybercrime has become more advanced, data protection has become more complex, and Generative AI remains a frustratingly elusive silver bullet. Still, most of the conversations revolve around AI. What are the lessons to be learnt to make 2025 a better year?
Let’s start with Generative AI:
While generative AI remains a key topic in B2B tech, the industry is far from mastering it. Whether companies have tried adding it to their products or internal operations, the complexities of getting it to work effectively are diminishing the returns on investment.
It’s been a wake-up call for those companies that believed GenAI would be a quick fix, creating efficiencies and saving costs. Gaps in data and staff resistance to adopting and embracing AI have slowed progress.
Also, not all companies have had a well-defined strategy for how they want to integrate GenAI, and many are still figuring it out. But those that have are getting ahead. The pressure is on to overcome the stumbling blocks and make GenAI work more effectively.
What about data and AI governance?
As much as the industry is embracing the capabilities of AI, concerns remain about data protection and AI governance, specifically the lack of clear legislation guiding what is considered ethical use of AI.
The challenge is that AI development is not slowing down while that gets resolved, which could add to the risks. Meanwhile, data protection requirements are getting more stringent.
It’s a bit of a chicken-and-egg situation, with tech firms caught in the middle. They can’t afford to compromise data, but neither can they sit back and wait until AI governance is hashed out, as that could take years. For now, with the demise of third-party cookies, the focus in B2B tech marketing at least is on maintaining clean and accurate data sets. That’s no bad thing, but it takes some effort to achieve.
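As a rough illustration of what that effort looks like in practice, here is a minimal sketch of normalising and de-duplicating a first-party contact list in Python. The field names, records and merge rule are hypothetical examples, not anyone’s actual pipeline.

```python
# Minimal sketch of cleaning a first-party contact list.
# Field names, records and rules are hypothetical examples.

contacts = [
    {"name": "Ada Lovelace ", "email": "Ada.Lovelace@Example.com", "company": "Example Ltd"},
    {"name": "ada lovelace",  "email": "ada.lovelace@example.com", "company": "Example Ltd."},
    {"name": "Grace Hopper",  "email": "grace@example.org",        "company": "Example Org"},
]

def normalise(record):
    """Trim whitespace, lower-case emails, strip trailing punctuation from company names."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "email": record["email"].strip().lower(),
        "company": record["company"].strip().rstrip("."),
    }

def deduplicate(records):
    """Keep one record per email address - the simplest possible merge rule."""
    seen = {}
    for record in map(normalise, records):
        seen.setdefault(record["email"], record)
    return list(seen.values())

for row in deduplicate(contacts):
    print(row)
```

Even this toy version shows why the effort adds up: every field needs its own normalisation rule, and every merge rule is a judgement call about which record to trust.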
More advanced but less secure.
Cybersecurity continues to be a hamster wheel, with threat actors getting bolder, more aggressive and less selective about their targets. Even the most advanced technologies are proving vulnerable, especially with cybercriminals using AI as their partner in crime.
To counter this, cybersecurity teams are fighting back with AI, using it to identify abnormal patterns of behavior or vulnerabilities that put systems at risk. While AI arguably speeds up incident response by identifying breaches more quickly, cybersecurity continues to get more complex.
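To make the idea of spotting abnormal behavior concrete, here is a minimal sketch using scikit-learn’s IsolationForest on made-up hourly login counts. The data, the single feature and the contamination setting are all assumptions for illustration; real detection systems use far richer signals and tuning.

```python
# Minimal sketch of anomaly detection on login activity.
# The data and parameters are hypothetical; production systems use far richer features.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hourly login counts for roughly a week, plus one suspicious spike.
normal_hours = np.random.default_rng(0).poisson(lam=40, size=(167, 1))
spike = np.array([[400]])  # e.g. a credential-stuffing burst
logins = np.vstack([normal_hours, spike])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(logins)

# predict() returns 1 for normal points and -1 for anomalies.
flags = model.predict(logins)
print("Flagged hours:", np.where(flags == -1)[0])
```

The point is not the specific model but the workflow: the system learns what “normal” looks like and flags what deviates, which is faster than waiting for a human to notice a breach.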
A single or even dual approach to security is no longer sufficient. Multiple layers and methods of security are required to stay ahead of the risks, and even then, there’s no guarantee they’re fail-safe. It’s adding cost and complexity to technology at a time when companies are desperately looking for greater efficiency. That’s something to keep in mind moving into 2025.