For risk professionals, leading through 2025’s volatility has been like living in an “Alice in Wonderland” unreality. Risk teams have never been more important as a function to guide their businesses through challenges such as geopolitical risk events, trade disruption, economic volatility, and regulatory disruption. Hopefully, this work doesn’t resemble the chasing of Lewis Carroll’s famous White Rabbit. Our latest report, The State Of Enterprise Risk Management, 2025, showcases a variety of data insights and graphics on industrywide and programmatic shifts impacting enterprise risk management (ERM) programs and how risk decision-makers are responding to them. Our data reveals that:
- Cyberattacks and tech dependency bring business resilience to the fore. The UnitedHealth Group breach and the global disruption caused by the CrowdStrike software update were stark reminders of the critical role that technology plays across our society. It’s thus unsurprising that 40% of domestic and 38% of multinational ERM leaders cited cyberattack velocity as a top risk driver. In addition, 36% of multinationals and 28% of domestic firms flagged overreliance on tech as a major risk. Risk leaders must map their software supply chains and ensure that their resilience simulations cater to a range of tech failures, not just cyberbreaches.
- AI and third-party risks remain heightened. While financial, trade, and geopolitical risks are dominating boardroom conversations, the real shift is happening beneath the radar. Tech vendors are embedding generative AI into core systems, and ERM teams are struggling to get involved early enough in the process to build in appropriate guardrails from the start. Third-party risks are not receiving as much attention as they require despite rising cyberattacks and systems failures linked to third-party suppliers, such as the recent spate of cyberattacks in the UK retail sector. Risk professionals must prioritize communicating the ROI and value of investing in and maturing both AI risk and third-party risk management programs.
- Critical risk events are more likely when ERM is not a boardroom concern. Nearly 75% of enterprises experienced at least one critical risk event in the past 12 months, and cyberattacks and IT failures account for most critical events globally. Companies without board-level ERM visibility were 20% more likely to suffer six or more critical events. Risk professionals need to focus both on getting ERM taken seriously by the board and on getting the board to help drive the right risk culture across the organization.
- Risk management budgets are growing but are not meeting the moment that we’re in. Most ERM budgets are only growing by 1–4%, barely keeping pace with inflation. Only 4% of companies anticipate a greater than 10% increase. Many ERM programs still struggle to prove ROI or align with business goals, leaving many to question the value beyond ticking regulatory compliance boxes. Chief risk officers need to show how ERM drives business value, not just compliance, to get the funding required to make better-quality risk management decisions.
- Identifying emerging risks sets ERM programs apart. Forrester clients have been telling us consistently that they want their risk function to implement the right guardrails to allow the business to confidently and quickly take on risks. Organizations remember being caught out by ChatGPT and other emerging technologies and want to transform the engagement and perception of their teams. Yet from our data, only 37% of risk decision-makers reported identifying emerging risks as their primary measure of success.
Forrester clients who want to explore these findings can book a guidance session or inquiry to discuss the research further with any of the authors.