A Showcase of Ambient Intelligence appliances

It’s been a while since my last blog post. Those were some very busy months. One of the highlights of those months was the work done for Atcom NEXT, a yearly event where we get to show the future of technology, put some of our ideas into practice, and create high-profile proofs of concept. A lot of innovations stood out during the event; the ones I got involved in were everything regarding the mobile applications, the burger ordering system (read below), and the coffee “non-ordering” system.

It wasn’t easy, and it required as much critical thinking as it did development. Much work went into all aspects of the event, but I would like to spend a little time writing about just two of them. While reading these lines, keep in mind that the whole event was the product of a team effort.

Small introductory video

Coffee Ordering System

Coffee was offered to attendees by having it delivered to them by the time they approached the bench, without their having to order in advance or wait in line.

Having already sent physical invitations to attendees requiring the download of the Atcom NEXT application beforehand, we had the necessary info about each attendee (name, profile image, and job position if they logged in using LinkedIn). Inside the “Coffee & Burger” section of the app, the user could set his coffee preference.

When approaching the bench, the attendee would see his face and job profile on a big screen, while the bartender would receive the attendee’s face and coffee preference on a dedicated screen facing the bartender’s side. That way, the bartender could prepare the coffee before the customer actually arrived at the counter to place the order.

Thanks to the invitations, we had the attendee’s name, profile picture, and coffee preference. But how do we know when he approaches the bench? How do we trigger updates to the two screens (the front screen and the one used by the bartender)?


That’s where beacons come into play. iBeacon devices were placed around the bench in discreet spots. They were configured to transmit at their maximum power, and all shared the same UUID. That way, the whole coffee bench area was “tagged” as iBeacon-enabled, and the transmission power allowed the mobile devices to receive a full-strength signal.

A stronger signal means clearer beacon readings and reduced interference, but also a greater range at which the beacons are picked up by the mobile devices. Which brings us to the most difficult problem: the range at which to trigger the event that the user is approaching the bench. How do we convert signal strength to distance, and at what distance do we send the “user entered” and “user exited” events to the screens?

We decided to use the received signal power to determine whether the user was inside the area and how far he was from the signal’s source. We ran many tests and made numerous calculations to arrive at a heuristic for what counts as a good distance at which to notify the bartender, considering that there would be 600 people in the same room, near the beacons, but only a dozen waiting to be served coffee at any given time.
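
For a rough idea of the conversion involved, here is a minimal sketch using the standard log-distance path-loss model; the txPower, exponent, and threshold values below are hypothetical placeholders, not the calibrated numbers we settled on:

```javascript
// Log-distance path-loss model: the estimated distance grows
// exponentially as the received signal (RSSI, in dBm) drops below
// the calibrated power at one metre (txPower). The environment
// exponent n is ~2 in free space and higher in a crowded room;
// all three constants here are illustrative, not calibrated values.
function estimateDistance(rssi, txPower = -59, n = 2.5) {
  if (rssi === 0) {
    return -1; // no valid reading
  }
  return Math.pow(10, (txPower - rssi) / (10 * n));
}

// RSSI is noisy, so average the last few readings before deciding
// whether to fire the "user entered" event.
function isNearBench(recentRssi, thresholdMetres = 5) {
  const avg = recentRssi.reduce((sum, r) => sum + r, 0) / recentRssi.length;
  return estimateDistance(avg) <= thresholdMetres;
}
```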

We used background location services on both iOS and Android; additionally, on iOS we used the Significant Location Change service to start the application in the background when the attendee approached the venue for the first time, even if the application had been completely “killed”.

The application would send events to a NodeJS server, which would in turn stream the “enter” and “exit” events to the bartender screen and the front screen. We used socket.io to deliver the notifications to the two screens in real time.
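
A minimal sketch of what such a relay could look like, assuming a current socket.io setup; the endpoint, room, and event names are illustrative, not our actual API:

```javascript
// Mobile apps report presence over HTTP; the server broadcasts the
// "enter" / "exit" events to the connected screens via socket.io.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());
const server = http.createServer(app);
const io = new Server(server);

// The front screen and the bartender screen connect as clients
// and join a common room.
io.on('connection', (socket) => {
  socket.join('screens');
});

// event is "enter" or "exit"; attendeeId identifies the invitee.
app.post('/presence', (req, res) => {
  const { attendeeId, event } = req.body;
  io.to('screens').emit('presence', { attendeeId, event });
  res.sendStatus(204);
});

server.listen(3000);
```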

Phygital Lunch

One of the main attractions of the event was its last part: the phygital (physical-digital) lunch.

We set up tables with touch screens running Windows, and built a Windows application that could create burger orders through a playful interface and send HTTP events to the NodeJS server we had set up for the event.

Once an order was created, it had to be associated with an attendee; hence the QR code scan at the end of the ordering flow. When the attendee scanned the order with his mobile device, the Node server was notified of the order and the associated user. The Node server would then notify both the people working in the kitchen and the people cooking the actual burgers.
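
Continuing the server sketch above, the claim step could look roughly like this; the route, field names, and in-memory store are hypothetical:

```javascript
// In-memory order store for the sketch: the table app creates an
// order, the QR code encodes its id, and the mobile app claims it.
const orders = new Map(); // orderId -> { id, items, attendeeId, status }

app.post('/orders/:orderId/claim', (req, res) => {
  const order = orders.get(req.params.orderId);
  if (!order) {
    return res.sendStatus(404); // unknown or expired QR code
  }
  order.attendeeId = req.body.attendeeId;
  order.status = 'queued';

  // Fan the claimed order out to the grill and kitchen screens.
  io.to('grill').emit('order:queued', order);
  io.to('kitchen').emit('order:queued', order);
  res.sendStatus(200);
});
```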

One screen was placed at the grill, showing roughly how many burgers were about to be handed over to the kitchen once completed, grouped by type and cooking method.

Two screens were placed in the kitchen, one for each cook. The Node server was responsible for maintaining a queue of pending orders. Upon completing an order, a cook would ask for the next order to be assigned to him. Each cook could have only one active order at a time, and each cook would see only his own assigned pending orders, not the other cook’s. We therefore gave the Node server both an HTTP API and socket.io: it had to deliver real-time notifications, but also let every screen refresh and continue from where it left off in case of a network failure.
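
A sketch of how the one-active-order-per-cook queue and the recovery path might fit together, again with hypothetical route and field names:

```javascript
const queue = [];               // claimed orders waiting for a cook
const activeByCook = new Map(); // cookId -> order currently in progress

// A cook pulls the next order only after finishing the current one.
app.post('/cooks/:cookId/next', (req, res) => {
  const { cookId } = req.params;
  if (activeByCook.has(cookId)) {
    return res.sendStatus(409); // finish the active order first
  }
  const order = queue.shift();
  if (!order) {
    return res.sendStatus(404); // nothing pending
  }
  activeByCook.set(cookId, order);
  res.json(order);
});

app.post('/cooks/:cookId/done', (req, res) => {
  const order = activeByCook.get(req.params.cookId);
  if (!order) {
    return res.sendStatus(404);
  }
  activeByCook.delete(req.params.cookId);
  // The push notification to the attendee would be triggered here.
  io.emit('order:ready', { orderId: order.id, attendeeId: order.attendeeId });
  res.sendStatus(200);
});

// Recovery: after a network failure, a kitchen screen re-fetches its
// state over HTTP instead of relying on missed socket.io events.
app.get('/cooks/:cookId/state', (req, res) => {
  res.json({ active: activeByCook.get(req.params.cookId) || null });
});
```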

When the cook was done with an order, the attendee would be notified through a push notification that his order was ready.

A demonstration video is shown below.

Conclusion

The difficulty of Atcom NEXT lies in the connection and interaction between the digital and physical realms. During development we found out the hard way that creating Ambient Intelligence solutions requires much more critical thinking than purely digital applications, since real-world conditions such as metal interference, low connectivity, or unexpected user behaviour can render a solution useless in a matter of minutes and ultimately destroy the user experience. I personally remember spending entire days developing aspects of the event and scrapping the results completely, countless times, before arriving at something I felt confident would work without flaws.

You can find a short presentation made during the event on YouTube.

The 2016 Atcom NEXT event was held on the 20th of May at Gazarte, Athens.

https://next.atcom.gr