Last week we were welcomed by our member Shelton Fleming to the innovation lounge of Engage Works to look towards the future. Around the room were various technologies that guests were encouraged to try out, from glasses that can take pictures to screens that track the movement of your eyes.

Andrew Reid (Director, Strategy, Marketing and Digital Services, Shelton Fleming) introduced the afternoon’s programme, which had been designed to “elevate conversations” around AI and how it is impacting the events industry and content generation. Shelton Fleming’s Head of Innovation Paul Hannah then went on to define AI and where it stands today. He explained that AI is currently at a stage called Artificial ‘Narrow’ Intelligence, meaning it can solve problems based only on what it is given. The next stages are Artificial ‘General’ Intelligence, where AI will be able to infer knowledge, and then Artificial ‘Super’ Intelligence which, whilst concerning in some ways, could be leveraged to solve massive problems that require an ability to ingest colossal amounts of data and find patterns.

In the world of events and exhibitions, AI can be used to personalise experiences for attendees without asking more of the organiser. It can take on mundane tasks to free up time, and mine data to distil patterns into actionable results. Other exciting possibilities include wearable tech that can translate in real time, allowing people to communicate even when they have no language in common, as well as personal AI devices that aim to be alternatives to the phone.

We were then split into teams and, led by one of our inaugural shadow board members, Creative Technologist Dan Rhodes, were given the task of creating a trailer for a new podcast. We used ChatGPT to come up with a name, theme and tagline for our podcast, and to help us write a script for our audio trailer. With clear instructions, ChatGPT also edited the script, updating it to reflect the voice of our presenter and adding some humour to finish. We were then able to produce a cover image based on our script and tagline, and choose an actor’s voice to read the script over it. It was fascinating to see what the different tools were able to do, where they were most useful and where they were more limited. It was also an eye-opener as to how specific your language needs to be when approaching these platforms in order to get the most out of them.

It was an illuminating day that made AI technology feel much less intimidating, which was reflected in an 83% uplift in understanding of AI in events over the course of the afternoon. This was measured by Focal, a new measurement tool that Shelton Fleming has created. You can view the full Focal report here: FutureHack24-Focal-Report.
