Written by Stephan Forseilles, CTO at Easyfairs, and Chair of the UFI Digital Innovation Working Group.

There is no doubt that AI has been on everyone’s mind in recent months. It seems every tradeshow organiser has started initiatives to probe, validate, experiment with, investigate or even implement AI in their business environment.

While the majority seem to rely on external partners to innovate in this field, some have decided to do it mostly ‘in-house’ and to recruit the (rare) talent needed for this. My goal today is to look at both sides of the coin and understand the various options. To do so, I want to share my own experience as CTO of Easyfairs, but also the experience of some of my most innovative counterparts in the industry.

I must admit that I’ve always been fascinated by AI. Back at university in the mid-90s, I worked with some of the first usable neural networks of the time (developed by the University of Stuttgart, which still maintains a webpage about them that retains all of its wonderful early-90s feeling). Neural networks really were quite simple back then. The computing power at our disposal (especially as students) did not allow us to create the deep learning algorithms that prove so useful today. Still, we already had usable discriminative neural networks that could decipher handwriting, however painstakingly, letter by letter. But the seed was planted…

That could be the reason why, when AI emerged as “the next big thing” in tech, we at Easyfairs decided back in 2018 that it would be part of our “core business”. First step: set up an internal team in charge of creating a company-wide data warehouse that would collect data from all systems, current and historical. The goal was to have a repository we could use to train deep learning algorithms (generative AI was not yet a thing) to help with various aspects of our business. As a nice side effect, this also gave us a “one version of the truth” data repository. But that’s another subject.

Since then, our internal teams of Data Engineers, Data Analysts and Data Scientists have grown to 10 people, and they are a vital part of our tech ecosystem. We have several deep learning algorithms in operation in our day-to-day business and, at any point in time, at least three to five new algorithms or generative AIs in development, testing or proof-of-concept phase. Of course, not all of them will make it to the “production stage”. But we’re learning, progressing and, most importantly, having fun!

Why do it “in-house”? For me, it has several advantages:

  • I work with people I like, who are part of my team and are fully dedicated to Easyfairs.
  • We build an invaluable experience that is ours. We don’t share it with anyone else (except you, of course).
  • We never hesitate to conduct one more experiment: the team is there anyway!
  • We have a very stable team in which people can learn and evolve over time.
  • The team is super specialised in one area: events! They know everything about the business, they understand absolutely every aspect of the available data.
  • Did I mention that I like working with them?

But ‘in-sourcing’ has its downsides too, which is why other CTOs have decided to outsource AI. To better understand their reasons, I asked some of the best in the industry why they chose to do it in-house or to outsource.

I discussed the matter with several fellow CTOs:

Do you develop AI capabilities with an in-house team, or do you prefer to use external partners?

NEDVED: It is a good question. I think it depends… For the data platform part, i.e. collection, storage, management, governance, etc., I plan to build it in-house. Data analysis we will also perform in-house, while for data modelling we will start by outsourcing or using existing external products. After a while, we will identify 1-2 key areas to in-source, if doing so can give us a huge competitive edge over others. For AI solution development, we will do the design and solutioning in-house, closest to the business users. The development part will most likely be outsourced or done using existing external products. Governance should be in-house.

PATRICK: We develop AI capabilities within an in-house team. This is especially the case for engineers, data scientists and architects, where we add new, incremental “AI-specific roles” and FTEs. For other business functions it is less about incremental roles/FTEs and more about training or replacing existing in-house roles with staff who can get the most value out of AI opportunities for their “already existing functions”.

ALISTAIR: We don’t currently have any in-house capability for ML modelling or AI creation. My preference would be to use commoditised AI as much as possible without needing in-house resources. However, it is very clear that, due to the unique nature of the B2B and B2C events business model, there will always be a need to create bespoke ML models to fulfil the business needs.

Did you make that decision long ago?

NEDVED: Yes, most of it. But it changed quite a bit after generative AI emerged, which gives us many more opportunities to use external products for process improvements.

PATRICK: In principle, we decided many years ago to keep in-house the digital capabilities that we believe are, and will be, crucial for our core business and long-term growth. However, deciding that AI belongs to this “must-have” category was something we did in Q1 2023, based upon the dynamic developments that started in November 2022. There are many other digital developments that may become very important and disruptive for our business in the future, where we are observing the development but not investing money or significant resources as of now (e.g. Metaverse, NFTs, …).

ALISTAIR: No, the rate of change in the AI industry and the development of commoditised AI tools available in platforms such as AWS and GCP mean that decisions in this area have to remain fluid and agile.

Why this choice?

NEDVED: We don’t have the luxury of having a large development team here. Growing our AI modelling capability is quite challenging, considering the resources required, the competition for talent, and the retention concerns. It may be more cost-effective to allocate our limited in-house resources to those areas that require close interaction with the business team and where our competitive edge lies.

ALISTAIR: We need to start by getting the senior leadership engaged with the topic and helping to define the direction. This could be internal and small-scale to begin with. I see us needing an experienced lead who could leverage commoditised AI platforms to run fast experiments before scaling up.

Are you happy with your choice?

NEDVED: We are currently in the early stages of consolidating enterprise systems and building up the data platform. More time is required for experimentation and adaptation.

PATRICK: Yes. We have to develop these additional capabilities incrementally in any case, and we can always adjust the speed and investment level here and there, especially since there is a lack of skilled professionals fitting our requirements in Germany.

ALISTAIR: Yes.

Do you intend to change your strategy in the future?

NEDVED: Yes, I would expect it to change as we get deeper into AI.

PATRICK: No.

ALISTAIR: Yes. If the business can focus on defining the actual questions it wants to answer, I think this will inform the type and style of models we need, which could change this decision.

As you can see, there’s more than one way to bake a cake.

What is your opinion? On which side of the playing field are you: team “Insource”, team “Outsource”, or team “Wait and see”? I’d love to hear from you in the comments!