One of our bigger current projects is going online soon, and one part of it is an API that delivers only JSON and XML to many possible clients. To speed things up, we use Varnish in front of the application (a lot of requests are easily cacheable, since they change once a day at most). We also use HHVM for some of the requests, for now just the ones that will have many cache misses (live-search requests, for example). We don’t yet dare use HHVM for all requests; we’d like to gather some experience with it first.
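A setup like this can be expressed in a few lines of VCL. The following is a minimal sketch only, assuming Varnish 3 syntax, a stateless API living under a hypothetical `/api/` prefix, and a 24-hour freshness window; the actual rules of the project will differ.

```
sub vcl_recv {
    # Assumption: the API is stateless, so cookies can be
    # dropped to make GET requests cacheable.
    if (req.request == "GET" && req.url ~ "^/api/") {
        unset req.http.Cookie;
        return (lookup);
    }
}

sub vcl_fetch {
    # Responses change once a day at most, so cache them
    # for up to 24 hours.
    if (req.url ~ "^/api/") {
        set beresp.ttl = 24h;
    }
}
```

With rules along these lines, repeat requests for the same JSON or XML resource are served straight from the cache and never reach PHP at all.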
This is the final part of the blog post series on Multi-Device Interactions. Previously, I outlined the second-screen trend in the TV industry (Part 1) and introduced some underlying models of our multi-device world (Part 2).
In this blog post we (finally) focus on the practicalities of Multi-Device Interaction Design. It has indeed become a challenge for User Experience Designers to develop solutions that account for the multi-device behaviour of today's users. As mentioned earlier, we have developed a canvas for thinking through and designing multi-device interactions. The Multi-Device Interaction Canvas (MDIC) is a simple, modifiable canvas for mapping multi-device use cases. It is based on the theoretical models we presented in previous blog posts.
At its core, it respects three important factors:
The first alpha version of the next major PHP release was made available last week; a list of the new features can be found at php.net.
Liip has a tradition of company-wide gatherings. As a company whose offices are spread across Switzerland, we want to make sure people know each other, and of course we also want to benefit from the knowledge we constantly create with our work in Lausanne, Fribourg and Zürich.
One of the formats we find useful is the Techday, a yearly event where we learn about technical or business topics during the day and celebrate together at night. This time, however, we thought that opening that format up and joining forces with our friends from Mayflower could make it even better. They operate in a very similar way to us, but are located in Würzburg and München, Bavaria.
In our last blog post we started off with John’s story to show the everyday encounter with multiple devices and screens, and outlined the emergence of the second-screen business. The classic second-screen solution is a companion app for mobile devices that delivers additional information alongside TV content, e.g. a quiz or sports statistics on your smartphone or tablet. With all the possibilities of a multi-device world, it’s crucial to focus on the conductor of all these instruments: the user! In the following sections we dive into some theoretical models of multi-device interaction.
This article, the first in a series on multi-device interactions, introduces the concept and analyses existing second screen solutions from the broadcasting industry.
Let us start with a (not so) small introductory story (or directly check out the main part).
Today we finally released a stable version of our small tool RMT. RMT (Release Management Tool) is a handy tool that helps you release software. It lets you create a clean release by running a single command.
With Facebook's recent announcement that their HHVM is now increasingly compatible with most of the popular frameworks, I was intrigued to finally try it out. We’re currently building a Symfony2-based application with pretty high performance requirements (though we can mostly meet them with Varnish), so I went and ran some performance tests on that real-life app.
This weekend I had the opportunity to attend and speak at Symfony Camp UA in Kiev. This event was organized for the fifth time already and draws many developers from the region. While I spent the entire week before doing an intensive Russian course in Odessa (just for fun; no, Liip has no plans to open an office in Russia), I gave my talk on REST in English. Besides mine, the only other talk in English was delivered by Pawel, who spoke about Sylius. It was great to finally meet Pawel in person! All the other talks were in Russian. It was somewhat possible to follow them when the slides contained enough code, but in the end I spent most of the time talking to people on the comfy chairs in front of the conference room. Two questions came up multiple times, so I figured I'd also answer them quickly here.
This is a guest post by Tim Bezhashvyly.