Digitalisation

Where is AI really heading?

If you work in electronics design, you’re already feeling the pressure to put AI in your product.

Avnet recently released its fifth annual Avnet Insights survey, which shows how your peers are actually deploying AI today – especially at the Edge – and what will matter most to your designs over the next 12–24 months.

The picture that emerges is not of a short‑lived boom, but of a steady, structural shift driven by better tools, maturing Edge silicon, and a hard focus on return on investment.

I spoke with Alex Iuorio, Senior Vice President of Global Supplier Development at Avnet, to learn more.

Alex Iuorio, Senior Vice President, Global Supplier Development, Avnet

AI adoption

The survey results showed a clear rise in the share of engineers embedding AI functionality into their equipment, particularly in Edge and microcontroller-based designs. Globally, over half of engineers (56%) said they are shipping products and solutions with AI incorporated into the design, a 33% increase from last year.

Interestingly, Iuorio views this growth as expected rather than explosive, and what’s driving it isn’t just enthusiasm – it’s the reinforcing loop of tools, hardware, and know‑how.

For design engineers, the implication is simple: AI capability is becoming a default expectation in many product categories, not a differentiator reserved for a few.

“AI has passed the proof-of-concept stage as more engineers continue to work the technology into products being shipped to customers,” said Iuorio.

The Edge comes of age

Between 2024 and 2025, Iuorio has noticed the rapid advance of computational capabilities at the Edge – specifically, neural networks running on chips that look very familiar to embedded designers.

Engineers are increasingly combining Edge AI and ML models to deliver greater functionality and value in their designed products and solutions. Over half of respondents (57%) said they are prioritising the incorporation of Edge AI and ML equally, underscoring the importance of both technologies as more engineers look towards multi-modal AI solutions.

Vendors are rolling out microcontrollers and processors explicitly optimised for AI workloads and wrapping them with secure, Cloud‑connected development environments.

One example discussed was Avnet’s IOTCONNECT platform, which enables OEMs to easily build and deploy secure, innovative apps featuring comprehensive analytics.

In practice, this means you can:

  • Prototype with AI‑optimised parts out of the box
  • Stream data to a supplier’s hosted algorithms and get results back into your board
  • Iterate on data‑rich, AI‑driven applications without having to build the full stack yourself

If you’re used to traditional embedded workflows, this is a significant shift: the silicon, tools, and Cloud services are increasingly co‑designed to accelerate Edge AI.
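
To make that workflow concrete, here is a minimal sketch in Python of a device sending a telemetry sample to a hosted model and acting on the result. The endpoint URL, API key, and payload fields are hypothetical placeholders, not any specific platform’s real API.

```python
import time

import requests  # assumes the 'requests' package is available


# Hypothetical endpoint and key: the real URL, payload format and
# authentication scheme depend on the platform you use.
ENDPOINT = "https://example.com/api/v1/infer"
API_KEY = "YOUR_API_KEY"


def read_sensor() -> dict:
    """Placeholder for a real sensor read on your board."""
    return {"timestamp": time.time(), "vibration_mm_s": 1.7, "temp_c": 41.2}


def classify_remotely(sample: dict) -> dict:
    """Send one telemetry sample to the hosted model and return its verdict."""
    response = requests.post(
        ENDPOINT,
        json=sample,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"label": "normal", "score": 0.93}


if __name__ == "__main__":
    verdict = classify_remotely(read_sensor())
    print("Hosted model says:", verdict)
```

The point is not the specifics but the division of labour: the board gathers and packages data, the hosted service runs the heavy model, and the result comes back quickly enough to act on.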

Data quality: the quiet, persistent bottleneck

Despite the hardware/software advances, the survey confirms that data quality remains a top design challenge – and has for several years.

Almost half of surveyed engineers (46%) cited data quality issues as one of the top design challenges when integrating AI into designed products and solutions, with integration with existing tools (38%) and high costs (37%) taking the next two spots. These were the same top three challenges as last year.

“These are not new challenges, but they are now at the forefront because of the very nature of AI,” said Iuorio. “Engineers are working with massive data sets, and the quality of that data dictates how precise the outcomes will be. Identifying these problems is the first step toward solving them, both from the perspective of the engineers and the companies that support them. These challenges are not insurmountable, and here at Avnet, we can work with our customers and suppliers to ensure that their engineers are best supported in confronting them.”

Data issues can be split into at least two domains that matter to you as an engineer:

  1. Secure development/characterisation data – how you understand and validate parts and systems in controlled environments
  2. Operational and business data – the live data that feeds your models in deployment

On the development side, secure platforms that characterise devices on your behalf can dramatically cut risk.

On the operational side, a major theme is the gap between public and private AI offerings. Engineers cited using ChatGPT (69%), Google Gemini (57%), and Microsoft Copilot (50%) as go-to sources. However, they are turning to these tools out of necessity. Only 16% of global engineers said they would prefer to use a publicly available LLM to answer technical questions, but almost half (47%) would prefer to use an LLM trained by engineers outside of their organisation, illustrating a gap in available tools.

For design engineers, this translates to a need to design for data as much as for circuitry: sensor placement, telemetry, on‑device preprocessing, and secure update paths all determine whether your system ever sees ‘good’ data in the field.
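
As an illustration of what ‘designing for data’ can look like on the device itself, the sketch below (Python, with made-up window sizes and thresholds) smooths readings over a short window and rejects obvious sensor glitches before anything is transmitted, so the model downstream only ever sees plausible data.

```python
from collections import deque
from statistics import mean


class TelemetryPreprocessor:
    """Illustrative on-device preprocessing: smooth readings and drop
    obvious glitches so only plausible data is ever transmitted."""

    def __init__(self, window: int = 8, max_jump: float = 5.0):
        self.history = deque(maxlen=window)  # rolling window of accepted samples
        self.max_jump = max_jump             # reject spikes larger than this

    def process(self, raw: float) -> float | None:
        # Treat a large jump from the recent average as a sensor glitch
        if self.history and abs(raw - mean(self.history)) > self.max_jump:
            return None  # caller decides whether to log or simply discard
        self.history.append(raw)
        return mean(self.history)  # smoothed value to send upstream


pre = TelemetryPreprocessor()
for reading in [20.1, 20.3, 55.0, 20.2, 20.4]:  # 55.0 is an injected glitch
    cleaned = pre.process(reading)
    if cleaned is not None:
        print(f"send {cleaned:.2f}")
```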

Where AI will hit first

When engineers were asked which AI trend will have the biggest impact in the next 12 months, responses spread across 14 categories, indicating broad experimentation.

“I don’t necessarily think that we’ll have one category or one application that emerges and rules all. You’ll see the growth in all applications – provided there’s discernible return on investment,” said Iuorio.

The top three embedded AI deployments by adoption rate – process automation (42%), predictive maintenance (28%), and fault/anomaly detection (28%) – were consistent with last year.

For design engineers, that suggests two practical filters for AI features:

  1. Can you demonstrate a measurable gain (throughput, yield, uptime, headcount reduction)?
  2. Can you instrument and monitor that gain in the deployed system?

Features that pass both tests are more likely to survive budget scrutiny.
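
As a rough illustration of the second filter, a deployed unit can carry a few simple counters that make the gain visible. The sketch below (Python, with illustrative field names not tied to any particular platform) tracks uptime and early fault detections so the numbers exist when the feature comes up for budget review.

```python
from dataclasses import dataclass


@dataclass
class FeatureKpis:
    """Minimal counters for making an AI feature's gain measurable.
    Field names are illustrative, not tied to any particular platform."""
    hours_logged: float = 0.0        # total operating hours recorded
    hours_down: float = 0.0          # hours lost to downtime
    faults_caught_early: int = 0     # anomalies flagged before failure

    def log_runtime(self, hours: float, down: bool = False) -> None:
        self.hours_logged += hours
        if down:
            self.hours_down += hours

    def log_early_fault(self) -> None:
        self.faults_caught_early += 1

    @property
    def uptime_pct(self) -> float:
        if self.hours_logged == 0:
            return 0.0
        return 100.0 * (self.hours_logged - self.hours_down) / self.hours_logged


# Example: figures a deployed unit could report back for review
kpis = FeatureKpis()
kpis.log_runtime(700)
kpis.log_runtime(20, down=True)
kpis.log_early_fault()
print(f"uptime {kpis.uptime_pct:.1f}%, early detections {kpis.faults_caught_early}")
```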

What you should prioritise in the next 12–24 months

Overall, global engineers are embracing the new technology, but they are still determining its impact and the skills needed for success.

Iuorio’s advice to engineers boiled down to two priorities:

  • Automate – use AI to govern and optimise production and operational processes where ROI is clear
  • Embed – build AI capabilities into your end products, especially at the Edge

Ten years ago, the big shift was simply getting devices connected – the “rise at the Edge.” Today, connectivity is assumed. The next competitive layer is on‑device intelligence: “If connectivity [is universal], then the next step seems [to be] ‘I want to buy some intelligence.’ So that’s what engineers have to focus on, getting those capabilities into their products.”

In crowded markets, arriving as the third or fourth AI‑capable product may not be enough. Early, well‑executed designs are likely to shape expectations.

For engineers, the take‑home is clear: the hardware, tools, and ecosystems for AI at the Edge are now mature enough that “waiting to see” is itself a risk. The next 12–24 months will favour teams that can pair solid design with a pragmatic approach to data, power, and ROI‑driven AI features.