About Lucas Challamel

Novelist, Author of "The Yellow Stamp"

Happy 2015!

Happy New Year 2015!

New Year's celebrations in Sydney have been amazing, starting with the now iconic fireworks on the bridge:

Wishing all my friends and fellow web professionals out there a very happy and successful year!

And more Radio Meuh for all please …

Radio Meuh

MWD0701: Log Management with ELK


In our series around modern web development, I'd like to touch on a vital component of the production pipeline, sitting in the area of debugging and monitoring (the MWD07 chapter): log management. Too often this is overlooked, even by seasoned developers and dev managers, and that's a real shame, because at every stage of the application life cycle, logs are a goldmine!

Obviously first and foremost for debugging purposes, at the development and testing stages. But also later on, once the application is in production, for performance monitoring, bug fixing and simply for usage analytics. There are a lot of logs available in a web stack, not to mention those that you will create and populate ad hoc for the verbose logging and overall auditability of your application: system logs, web server logs (access and errors), database logs, default framework-level logs (such as those you'll get in Zend Framework or Symfony, for instance, in the PHP arena), Postfix and other mail logs, etc. All of these deserve proper handling, rotation, storage and data mining.

In my past life in agency-land, I had the opportunity to play with a variety of web log analysers such as AWStats, Webtrends and the like. I also used the community version of Splunk with reasonable success; back then it seriously helped trace back a couple of server hacks, and also provided custom stats around web campaigns to hungry marketers.

Now that I am working on one main web application with my current employer, I have been looking for a robust and sustainable solution to manage logs. And while looking along the lines of Logstash, a tool I used previously for a Java platform, I have discovered the new comprehensive solution now known as the ELK platform.

ELK stands for Elastic Search + Logstash + Kibana

Elastic Search has been around for a while, as a real-time search and analytics tool based on Lucene. Recently funded with a $70M C-round (press release), the company has undertaken the ambitious “Mission of Making it Simple for Businesses Worldwide to Obtain Meaningful Insights from Data”. Nothing less.

Logstash is a nice piece of software started 5 years ago, and maintained since then, by Jordan Sissel, a cheerful fellow developer also guilty of some other nifty little utilities, such as the handy FPM. Logstash helps you take logs and other event data from your systems and store them in a central place. It is now commercially supported by Elastic Search, and Jordan Sissel has joined the team.

And finally, Kibana is a web frontend to visualise logs and time-stamped data. Produced by the vibrant Logstash community, with notable contributions from early committer Rashid Khan, it is now commercially supported by Elastic Search as well, as the preferred visualisation and dashboarding tool for Logstash and Elastic Search.

(Diagram: the ELK platform pipeline)

So how does it work? Well the diagram above will give you the gist of it:

  • Logstash ingests log files as inputs, applies codecs and filters to them (note the amazing Grok library, used as a middleware for regex patterns) and writes to a variety of outputs, with specific support for Elastic Search; a configuration sketch follows this list.
  • Elastic Search consumes Logstash outputs and builds search indexes.
  • Kibana offers the user-friendly interface anyone expects, to build business-enabling reports and dashboards.
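To make this concrete, here is a minimal logstash.conf sketch, assuming an Nginx access log as input; the path, type label and options are examples only, so check the documentation for your Logstash version:

input {
  file {
    path => "/var/log/nginx/access.log"
    type => "nginx-access"              # example label, used to route filters
  }
}
filter {
  grok {
    # parse each raw line into structured fields with a stock pattern
    match => [ "message", "%{COMBINEDAPACHELOG}" ]
  }
}
output {
  elasticsearch {
    host => "localhost"                 # example: a local Elastic Search node
  }
}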
Sample Dashboard in Kibana 3

To get the full picture of the solution, there's probably no better preacher than the creator himself, Jordan Sissel, who has been a faithful presenter at PuppetConf for the last 3 years; check out the YouTube recordings of his talks.


MWD0201: Setting up a Mac for development (update)

A few months ago, I had a first crack at this topic: how to set up your Mac for modern web development. If you are curious enough, you'll find the blog post here. Eight months on, I have taken a few things on board, and I believe the time has come for an update.

The full step-by-step document is available as a PDF attached (SettingupaMacforDevelopment_v1.1), but to summarise my take on this topic:

  • You need some basic utilities: OS enhancements, editors, network utilities.
  • You need Homebrew, the missing package manager for Mac OS X. Thanks to it, you will be able to install all the languages and tools you need.
  • Finally, you need the DevOps tools required for modern automation and deployment practices: VirtualBox, Vagrant and Docker (see the sketch just below).
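As a side note, the virtualisation tools themselves can be installed from the command line if you use the homebrew-cask extension. A sketch, assuming homebrew-cask is already set up as per its own instructions:

$ brew cask install virtualbox vagrant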


Once this is all done and dusted, you will be able to run a state-of-the-art web development environment on your Mac, on a day-to-day basis.

With handy shortcuts defined in your .bash_profile file, you will be able to start and stop services as you need them. A typical list of aliases would be:

# Adding aliases

# PHP-FPM commands
alias php-fpm.start="launchctl load -w /usr/local/opt/php55/homebrew.mxcl.php55.plist"
alias php-fpm.stop="launchctl unload -w /usr/local/opt/php55/homebrew.mxcl.php55.plist"
alias php-fpm.restart='php-fpm.stop && php-fpm.start'

# MySQL commands
alias mysql.start="launchctl load -w /usr/local/opt/mysql/homebrew.mxcl.mysql.plist"
alias mysql.stop="launchctl unload -w /usr/local/opt/mysql/homebrew.mxcl.mysql.plist"
alias mysql.restart='mysql.stop && mysql.start'

# PostgreSQL commands
alias pg.start="launchctl load -w /usr/local/opt/postgresql/homebrew.mxcl.postgresql.plist"
alias pg.stop="launchctl unload -w /usr/local/opt/postgresql/homebrew.mxcl.postgresql.plist"
alias pg.restart='pg.stop && pg.start'

# NGINX commands
alias nginx.start='sudo nginx'
alias nginx.stop='sudo nginx -s quit'
alias nginx.reload='sudo nginx -s reload'
alias nginx.restart='nginx.stop && nginx.start'
alias nginx.logs.error='tail -250f /usr/local/etc/nginx/logs/error.log'
alias nginx.logs.access='tail -250f /usr/local/etc/nginx/logs/access.log'
alias nginx.logs.default.access='tail -250f /usr/local/etc/nginx/logs/default.access.log'
alias nginx.logs.default-ssl.access='tail -250f /usr/local/etc/nginx/logs/default-ssl.access.log'
alias nginx.logs.phpmyadmin.error='tail -250f /usr/local/etc/nginx/logs/phpmyadmin.error.log'
alias nginx.logs.phpmyadmin.access='tail -250f /usr/local/etc/nginx/logs/phpmyadmin.access.log'

# WebDEV shortcuts
alias webdev.start='php-fpm.start && mysql.start && nginx.start && mailcatcher'
alias webdev.stop='php-fpm.stop && mysql.stop && nginx.stop'
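Reload your profile, and the whole stack then starts or stops with a single command:

$ source ~/.bash_profile
$ webdev.start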

To conclude, the most important thing is to keep your webdev environment up to date on an ongoing basis.

Mac OS X updates

Visit the App Store to check for OS-level updates. Pay particular attention to Xcode updates.

Homebrew updates

All brew commands are here: https://github.com/Homebrew/homebrew/tree/master/share/doc/homebrew#readme

List the installed packages

$ brew list

Update the formulas and see what needs a refresh

$ brew update

Now upgrade your packages, individually or as a whole:

$ brew upgrade
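Or, to upgrade a single formula at a time (php55 here, to match the aliases above):

$ brew upgrade php55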

If your paths and launch files are set properly, you should be fine even with an upgrade of PHP, MySQL, Nginx or NodeJS.

Pear updates

Simply run this to get a list of available upgrades:

$ sudo pear list-upgrades

And then to apply one of them:

$ pear upgrade {Package_Name}
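For instance, taking PHP_CodeSniffer as an example package:

$ pear upgrade PHP_CodeSniffer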

Gem updates

All Gem commands are here: http://guides.rubygems.org/command-reference/

List the installed packages

$ gem list

List those needing an update

$ gem outdated

Then update gems individually or as a whole:

$ gem update
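For instance, to update a single gem (bundler as an example):

$ gem update bundler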

Node updates

Node itself should be updated with Homebrew on a Mac.

$ brew upgrade node

To update Node Package Manager itself, just run

$ sudo npm install npm -g

To list all packages installed globally

$ npm list -g

Check for outdated global packages:

$ npm outdated -g --depth=0

Currently the global update command is buggy, so you can either update packages individually:

$ npm -g install {package}

Or run this script

#!/bin/sh
set -e
set -x
for package in $(npm -g outdated --parseable --depth=0 | cut -d: -f2)
do
  npm -g install "$package"
done

Note that all global modules are stored here: /usr/local/lib/node_modules

Conclusion

Obviously this is a personal flavour which characterises web development based on PHP, MySQL and NodeJS. For other destination ecosystems (Java, Ruby, Python), you can probably adapt the documentation above to fit your needs and specific constraints. But the main ideas remain: Use Homebrew, Ruby Gem, PHP Composer and Node NPM as much as you can to install additional libraries and manage dependencies.

Other tools I could have covered are log management platforms (such as Splunk or ELK), error catching (such as Sentry), mobile application utilities (such as Cordova, Ionic, Meteor), or design utilities (such as Omnigraffle, Pixelmator, Sketch, Mindmapple). Not to mention a variety of handy cloud services.

Please let me know what you guys out there think about this!

A star is born … well more exactly a Meteor! (v1.0.2 is out)


Meteor was recently released in its official version 1.0, long expected by its community of early adopters. If you don't know what Meteor is, rush to the website https://www.meteor.com and see for yourself.

In a nutshell, Meteor is a new, but very well-funded and production-ready, player on the scene, and one of the few frameworks that take a full-stack approach. Your app runs BOTH on the server and the client (in NodeJS on the server, and in your browser's JavaScript engine on the client), and the two halves work together very holistically. It also comes bundled with MongoDB (although you can replace this with a bit of tinkering).

Everybody knows Meteor uses NodeJS behind the scenes. But does it use the NodeJS version in your PATH? Hmmm… No. Meteor is ultra-portable and the developer does not need to know about NodeJS at all. So when you install Meteor, it downloads something called dev_bundle, which contains NodeJS and all the NPM modules needed by Meteor, all pre-compiled for your platform. That makes getting started with Meteor easier and quicker. Is there any problem with this approach? No. This is perfect; you just need to be aware of it, especially if you are planning to bundle several apps.
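Installing Meteor itself is a one-liner, as per the official website:

$ curl https://install.meteor.com/ | sh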

So why should you consider coding your next web app using Meteor?

  1. Your app will be a real-time one by default, thanks to the power of web sockets through NodeJS
  2. Just like in NodeJS you can code the full stack with just one language: Javascript
  3. You can save a lot of time with smart packages grabbed from the AtmosphereJS site
  4. The community is extremely supportive, and the company very well funded  (Read this)
  5. It’s optimised for developer happiness, and it’s friendly for beginner developers
  6. It inter-operates nicely with other JS libraries such as AngularJS, Famo.us, and more.
  7. It’s clearly ahead of the technical curve, and that reads through their mission statement: “… to build a new platform for cloud applications that will become as ubiquitous as previous platforms such as Unix, HTTP, and the relational database.”
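If you want to see these promises in action, scaffolding and running a first app takes three commands once Meteor is installed (my_app is a placeholder name; the app is then served at http://localhost:3000 by default):

$ meteor create my_app
$ cd my_app
$ meteor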


In conclusion, Meteor is extremely interesting and I think they do a lot of things very right; it's a delight to work with. EVERYONE coding JavaScript should learn it, because it's proposed the right way: full-stack. But it's only an option if you're in a position to replace your entire stack, client and server (or working from scratch, of course). If you already have, say, a web API that you work against, or if you have an existing JavaScript frontend app that you just want to add some structure to, it won't fit your needs. Then you would probably consider a more versatile approach, with ExpressJS as a NodeJS framework and Ionic as a mobile app packager (which I will cover in another post).


MWD03 – Provisioning a local development stack

In the previous post, we set up the Mac workstation and got it ready for modern web development.
In this chapter, we’ll discuss the next key step in setting ourselves up the right way to develop a web application, and this is about creating and provisioning a development environment.
Using Linux is not a crime!
VMs are fantastic
If you are planning to create your app using PHP, Java, Python and/or Ruby, then there's a 90% chance you will do that on a Unix/Linux-powered stack. Otherwise, you would go for Windows, and anyway things would not be very different.
Before we throw money out of the window renting a server in the cloud, let's be practical and consider the most obvious option, which is to leverage your own local workstation to set up a virtual environment. Note again that I advise against using platform ports of xAMP (Apache-MySQL-PHP), and there are a few good reasons for that, along the lines of consistency:
  • Operating system discrepancies (starting with file systems)
  • Software versions
  • Files and folder permissions
  • Stability
This said, the best thing to do is to provision a virtual machine which replicates as closely as possible the target production environment. To this end, we use a virtualisation platform like VirtualBox, as proposed in the previous article, and can install with it any preferred OS stack. Let's assume a CentOS 6.5 64-bit box for the example, but it could be anything else, including a custom, home-brewed VM.
Fortunately for us, instead of downloading the ISO from centos.org and going through the full install process, ready-made boxes are available on the web from a number of public repositories.
My Vagrant is rich!
Vagrant is an amazing, accessible and free utility, and I can hardly see how the modern web developer could ignore it. It allows you to create and configure lightweight, reproducible and portable development and staging environments, with the exact combination of services and utilities you need for your project. I will consistently use and refer to Vagrant hereafter, as I am now using it in both my hobbyist and professional lives.
The basics for Vagrant are very well explained on the official site, here: http://docs.vagrantup.com/v2/getting-started/index.html
To install Vagrant, just visit the page and download the right package for your OS: http://www.vagrantup.com/downloads.html
Done, we are ready to provision our Linux stack for development (and possibly staging) purposes:
As I am a RedHat/Fedora/CentOS enthusiast, I go for a CentOS 6.5 64-bit stack, which I pick from the shelves of Vagrant Cloud (but it could have been anywhere else): https://vagrantcloud.com/mixpix3ls/centos65_64
This one has been set up with the VirtualBox Guest Additions (and I have a specific short article to help you out with upgrading your VB Guest Additions in case you update VirtualBox).
Let’s first create a working folder:
     $ mkdir ~/sandbox/my_project
     $ cd ~/sandbox/my_project
Now I initialise my local Linux stack:
     $ vagrant init mixpix3ls/centos65_64
=> This immediately creates a local Vagrantfile in your project folder, which you can freely edit and tweak to suit your needs, as we will see later.
One thing you might like to do immediately, though, is to organise proper web port forwarding by inserting the following line in this Vagrantfile:
 
config.vm.network "forwarded_port", guest: 80, host: 8080
As you understand, this Vagrantfile should later on be part of your GIT repository, as a way to share with your fellow developers what sort of server environment your app is supposed to run on.
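Along the same lines, you may want to map your project sources into the VM's web root with a synced folder, by adding a line like this one (both paths are examples only):

config.vm.synced_folder "./src", "/var/www/html"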
For now, let’s just switch the Linux machine ON:
     $ vagrant up
 
It will take some time to download the gig of data corresponding to the image; just watch the progress in your terminal window. But eventually your VM is up and running, and you can seamlessly SSH into it using the simple command:
     $ vagrant ssh
 
3 commands in total: Isn’t that amazingly straightforward?
In case you wonder where all this magic happens, beyond the 5kb Vagrantfile stored in the project folder: the downloaded boxes live under ~/.vagrant.d in your home folder, and the per-machine state sits in a hidden .vagrant folder next to your Vagrantfile.
This is great, but still a little bit bare-bones for a web server, and we now have to continue provisioning the image, at least with a web server and a database server.
Do you manual?
The old dog and control freak in me can't help doing it the hard, manual way at least once, and that's what it looks like:
$ sudo yum update # Careful because a Kernel update may ruin the Virtualbox Guest Tools
$ sudo yum install ntp
$ sudo service ntpd start
$ sudo rm /etc/localtime
$ sudo ln -s /usr/share/zoneinfo/Australia/Sydney /etc/localtime #properly set server time to this part of the world we love
$ sudo yum install nano # I need my fancy text editor
$ sudo yum install httpd # This is Apache, and you could choose Nginx or Tomcat
$ sudo service httpd start
$ sudo chkconfig httpd on
$ sudo nano /etc/httpd/conf/httpd.conf # Change Admin email
$ cd /etc/httpd/conf.d
$ sudo mkdir vhosts
$ sudo yum install php php-mysql
$ sudo yum install php-* # all extensions, because I can’t exactly tell which ones I need for now
$ sudo nano /etc/php.ini # Change Memory parameters
$ cd /var/www/html
$ sudo nano /var/www/html/phpinfo.php
$ sudo tar -jxf phpMy* # Assuming the phpMyAdmin tarball has first been downloaded into this folder
$ sudo rm phpMyAdmin-4.1.13-all-languages.tar.bz2
$ sudo mv phpMyAdmin-4.1.13-all-languages/ phpMyAdmin
$ cd phpMyAdmin
$ sudo find . -type f -exec chmod 644 {} \;
$ sudo find . -type d -exec chmod 755 {} \;
$ sudo mv config.sample.inc.php config.inc.php
$ sudo nano config.inc.php # Change blowfish secret
$ sudo rm -R setup
$ sudo yum install php-mcrypt # Necessary to install MCrypt, and be able to deliver some reasonably serious work around cryptography
$ sudo yum install mod_ssl
$ sudo service httpd restart
$ sudo yum install mysql mysql-server
$ sudo service mysqld start
$ sudo mysqladmin -u root password ******
$ sudo chkconfig mysqld on
$ sudo yum install git
$ sudo nano /etc/httpd/conf.d/vhosts/my_website.conf # Create your virtual host configuration
$ sudo service httpd reload
Now just hit
=> http://127.0.0.1:8080/phpMyAdmin/ to access MySQL via PHPMyAdmin
=> http://127.0.0.1:8080 in your local browser, and you should see the magic happening, with the default Apache page.
It does not seem like much, yet you might easily spend 30+ minutes going through the above, if everything goes right and you do not inadvertently make any typos.
Plus, what you've just done, your fellow developers in the team would have to do as well, as they vagrant up their own local development environments: this is of course not acceptable.
One obvious and immediate countermeasure is to share the custom BOX we’ve just created with our peers via network or cloud storage. This is well covered by Vagrant here: http://docs.vagrantup.com/v2/boxes.html . Your peers would simply have to use the box add command:
$ vagrant box add my-box /path/to/the/new.box
$ vagrant init my-box
$ vagrant up
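And to produce that new.box file from your provisioned VM in the first place, Vagrant's packaging command does the job with the VirtualBox provider:

$ vagrant package --output new.box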
However, this is still a bit brute-force, and not fully flexible or future-proof: what if I realise I need to add a specific service or configuration? I would have to update my box and copy it again over the network to my peer developers, and moving around 1GB of data is not the most elegant thing to do, is it?
Therefore we are looking for a more scripted and flexible way to provision our Linux stack on the fly.
In my next article, I will discuss a couple of simple enough yet professional solutions to provision your development environment in a robust and agile manner using either Chef or Puppet.
Next: MWD04 – Provisioning the stack with Chef


Codecademy

Codecademy is an education company. But not one in the way you might think. We’re committed to building the best learning experience inside and out, making Codecademy the best place for our team to learn, teach, and create the online learning experience of the future.


Lynda.com

lynda.com is an online learning company that helps anyone learn software, design, and business skills to achieve their personal and professional goals.

MWD02 – Get the Mac workstation ready

Continuing this series about Modern Web Development!
As a non-Microsoft developer working on a Mac, you need to get yourself a few productivity tools which, unfortunately, are not all available out of the box on your MacBook or iMac.
STEP 1 – Starting with a few Mac OS X enhancements
The Finder: Even with the latest OS X 10.9 Mavericks update, the Finder still has some shortcomings, and in particular still lacks the ability to quickly toggle the display of hidden system files. My recommended solution is to switch to TotalFinder: US$15 well invested in your productivity. And it's more lightweight and stable than the very ambitious Pathfinder (US$39).
The Terminal: The native Mac terminal is OK, but not always easy to summon in context. To improve usability in that space, I have taken 2 actions. First, I have downloaded the excellent and free TotalTerminal utility from BinaryAge, the same company publishing TotalFinder. With its Visor feature, you can always summon the Terminal in context using CTRL+~, very handy. Secondly, I am leveraging the new service available in Mavericks which allows you to open a "new Terminal tab at Folder", as per instructions found here. Right-click any folder, select Services > Open Terminal, and there you go. A true game changer.
A decent text editor: TextEdit is definitely outdated, and I wonder when Apple will decide to give it a bit of attention! In the meantime, the modern developer will invest in a Sublime Text or Textmate 2 license. My preference goes to the latter, which has a much more comprehensive syntax library.
Other CLI developer tools:
Mac OS X native tools: First we need the Mac native tools, which are not installed by default but usually come with Xcode. If you are not planning to develop Mac or iOS apps and do not need Xcode just yet, these tools can simply be installed by running the following command in Terminal: $ xcode-select --install (or even simply $ gcc).
This will trigger a dependency pop-up indicating that Xcode requires the CLI developer tools and offering to install them: Just click install and wait for 30min.
Java: Java is an important language, and unfortunately it is not available natively anymore in the latest releases of Mac OS X.
The current Java version recommended by Apple is 1.6 and it can be downloaded here.
If for some reason you need to work on Java 1.7, visit the official Java website, at your own risk.
Ruby and Python: Ruby and Python, two immensely popular object-oriented scripting languages, have been installed as part of OS X for many years now. But their relevance to software development, and especially application development, has assumed an even greater importance since OS X v10.5. And as a modern developer, in particular in the PHP space, you are likely to use Ruby scripts for automation and deployment, with Composer, Capistrano and Chef for instance.
Ruby with Homebrew: Homebrew is a package manager for Mac. It helps install, in an organised manner, the additional services and utilities that Apple didn't, managing symlinks into the /usr/local folder. Just like Chef consumes recipes written in Ruby for server provisioning, Homebrew consumes Ruby-written formulas to install software packages on your Mac.
To install Homebrew, follow the instructions at http://brew.sh (note a dependency on XQuartz), then check your setup with:
$ brew doctor
To install Ruby with Homebrew:
$ brew install ruby
To manage the different versions of Ruby you may have on your system, use RVM:
$ \curl -sSL https://get.rvm.io | bash -s -- --autolibs=read-fail
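Once RVM is in place, installing and pinning a given Ruby is straightforward (the version number is just an example):

$ rvm install 2.1
$ rvm use 2.1 --default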
STEP 2: Get a good IDE
There are several options on Mac, and I can mention here the well-known Eclipse (all languages), Coda 2 (PHP), Netbeans (Java, PHP, C++) or Aptana Studio (PHP, …).
But my preference now clearly goes for the great and consistent suite of IDEs offered by JetBrains for the different languages:
  • IntelliJ for Java
  • PHPStorm for PHP => That’s the one I am mostly using (US$99 for a personal license)
  • Web Storm for general purpose HTML/CSS development
  • PyCharm for Python
  • Rubymine for Ruby
  • Appcode for iOS development
STEP 3: Source control with GIT
GIT is essential and mandatory, and what you need is the holy trinity: The command line on your workstation, a repository storage service, and a GUI client.
Mac OS X 10.9 comes with GIT 1.8.5 bundled, which is not the most recent version.
To upgrade GIT on Mac, the official GIT site proposes an installer.
Another option is to use Homebrew, with:
$ brew update
$ brew install git
This will get you the latest GIT, version 1.9.2 as I write these lines.
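Whichever way you install it, don't forget the one-off identity setup, so your commits are properly attributed (name and email are placeholders):

$ git config --global user.name "Your Name"
$ git config --global user.email "you@example.com"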
In terms of GUI, I enthusiastically recommend Atlassian SourceTree, which by the way comes with its own version of GIT bundled in, for self-consistency purposes (GIT 1.8.4 for SourceTree 1.8.1). The desktop client for Mac proposed by GitHub is also a decent one.
Lastly, make sure you have an account on GitHub, which offers only public repositories on the free plan. For this reason, I urge everyone to consider Atlassian Bitbucket, which offers free accounts with unlimited private repositories for teams of up to 5 members: it can't get any better!
STEP 4: Other essential network utilities
For FTP/SFTP and S3 access, I have found nothing better than Transmit 4 from Panic Software (US$34).
If you are after completely free software, though, you'd fall back to Cyberduck 4. And finally, Filezilla now enjoys a decent port for Mac, but unfortunately lacks S3 support.
For database connectivity, the "nec plus ultra" solution is Navicat, which offers connectivity to MySQL, PostgreSQL, Oracle, MS SQL Server, SQLite and MariaDB. Choose the Essentials edition for US$16 only, or go pro with Navicat Premium ($239).
If you use exclusively MySQL, then Sequel Pro is the perfect and free solution. Likewise for PostgreSQL, simply use the Postgres.App.
Also, you need a web debugging proxy, and the obvious choice here is Charles 3.9 (US$50).
Interesting to note, though, that Telerik is currently porting the remarkable Fiddler to Linux and Mac (alpha release).
STEP 5: Virtualisation software
First and foremost, I would strongly advise against setting up local xAMP suites, such as MAMP and XAMPP for Mac. It may look handy at first glance to run your Apache and MySQL servers locally, but it quickly becomes a pain in the back with compatibility issues, once you start using PHP and Apache extensions. Very messy and inaccurate.
According to the idea that there is "nothing like the real stuff", I urge any serious developer to work with locally virtualised environments replicating LIVE destination environments, whatever the operating system.
The best choice is to use Oracle VirtualBox, free and well supported.
Now you can manually install any of your favourite OS, and set your code folders as shared mount points.
But the latest delightful trick is to use Vagrant to flick up an environment in one command line and no time:
Download Vagrant for your system, and then in the terminal simply vagrant up! You can select an image on Vagrant Cloud: https://vagrantcloud.com/discover/featured
$ cd ~/my_dev_folder # Put yourself in the folder where you intend to store your web app
$ vagrant init hashicorp/precise64 # Install a standard Ubuntu 12.04 LTS 64-bit box
$ vagrant up
$ vagrant ssh # You are logged in!
In just a few minutes, you are logged into your VM and can start setting up the services you need for your app, starting with Apache, MySQL, PHP and more.

From Mobile First to Context First

Summary: As we step into the "delight economy" (10), in which you aim to turn first-time visitors and adopters into lifetime consumers and brand advocates, user experience and engagement are essential. And if device awareness was a vital step to take, to deliver optimised user interfaces, it was a baby one; we have yet to embrace more broadly the necessity of context awareness in our online presence and web services, to maximise engagement and conversions.
Mobile is dead

I read this interesting article on the blog of SDL, the software company behind the enterprise CMS Tridion. The title was purposefully provocative: "Mobile is dead" (1).

It makes the point that while it's great for a brand to have a mobile-friendly public website, a couple of mobile apps for iOS and Android, mobile-responsive landing pages, and even a mobile-first strategy and fully device-aware web services… this is just the very first step in a more ambitious journey towards the user, their needs and their context.
"From Mobile First to Context First", to summarise it in a concise formula, after Peter Sena from Digital Surgeons (2). And if you google it, you'll quickly grab an eclectic selection of bookmarks on the topic, walking you through user-centred and context-driven design, and leading you onto the adventurous ground of pageless design (3), smarter websites and digital storytelling (4) in the delight economy (5).

This is all very exciting and promising. It explains why leading CMS vendors like SDL, Sitecore and Adobe are fiercely competing in this new segment of CXM, or Customer Experience Management, seeking the Grail of ultimate context-based personalisation. And it also justifies the rapid emergence of all these startups focused on analytics and performance monitoring: Crazy Egg, Qualaroo, Optimizely, New Relic, Fuel Deck, etc.

Yet, I can't help wondering whether all this technology is going to help me actually get my "context first" strategy right without a radical change of perspective on the online presence of my company and my brands. Measuring variety and complexity is great for understanding it and gaining insights, but it can be outrageously onerous to action, as any digital marketing manager can tell you. And even so, once all is measured and charted, can that really help me overcome complexity, reach this state of simplexity (6) I am looking for, and start to produce delightful experiences which will turn my visitors into lifelong consumers and advocates?

The 5W questions
Maybe this needs a whole new view on what a website should be. Context First is a good starting point, asking the essential 5W questions of Who, What, When, Where and Why. So let’s discuss how web technologies can help us in answering these questions. We’ll discover that we are not necessarily talking about million dollars software here, but most often simple native browser and network capabilities.
Who: Let's clear up this one, which is pretty obvious. Unless you have NSA clearance, there's no way you can tell and guarantee who is visiting your site. Cookies, and in particular those from social networks, offer some interesting footprints, unless you explicitly ask your visitors through a form or a call to action.
When, Where: These are the easy ones, thanks to the native capabilities of browsers, devices and web servers. We can even know "where from…", through the HTTP referrer parameter, which most often provides invaluable pieces of context (see the curl example below).
Why: A subtle question which, outside of an explicit call to action, can only be dealt with in a predictive manner, by matching the known session parameters with preset personas and scenarios.
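For the technically minded, this referrer-based context is easy to simulate when testing a landing page, for instance with curl (the URLs here are examples only; note that the header is historically spelled "Referer" in HTTP):

$ curl -H "Referer: https://www.google.com/search?q=blue+widgets" http://www.example.com/landing-page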
 
The 3S: Search, Social, Spam
In fact, direct or bookmark access has decreased over the past few years, as most of the traffic comes from either search engines (primarily Google) or, increasingly, from social networks (primarily Facebook). According to recent studies, and depending on industry verticals, between 40% and 60% of traffic comes from organic or paid search, and around 20-25% comes from social and referrals, leaving only 20%-40% to direct visits. (7) (8) A part of this last chunk would be generated by another powerful source of traffic: Email Direct Marketing, EDM, or "Spam" for the sake of pulling together our 3S: Search, Social, Spam.
So WHY are visitors coming to my website? Because they've been searching for something, because a friend has shared an interesting piece of information with them, or because they have received an email call to action. In all 3 situations, your website should be able to capture this information and pull together some context for personalisation.
What: This question is twofold, as it involves the visitor but also you as a publisher. What is the visitor looking for, and what am I able and willing to propose to him or her? Back to the WHO question, this can only be determined through proactive guesswork, via carefully designed calls to action on my landing page.
The 3 clicks rule for engagement
Obviously, there will always be situations where you can't collect much about your direct visitor, due to their privacy settings for instance. Therefore you'll have to get the best out of their first few clicks and interactions to profile them and tailor their experience. As a rule of thumb, and beyond the myth (9), aim at locking down WHO they are and WHAT they are after in a maximum of 3 clicks, in order to serve them the best informed content from the 4th click onward, and produce delight.
Conclusion
So what about mobile, and browsing devices and software in general: do they really matter that much? They certainly do from a usability perspective, and we ought to serve the most suitable touch UI to visitors on smartphones and tablets. Again, it's about generating engagement through delight, via the best use of available capabilities.
But beyond that, talking context? I can browse with a smartphone on the couch, with a tablet from work, and with a laptop from a coffee shop terrace. Does the device really determine my context? Probably not; at the end of the day it's mostly a game of probabilities.
Device awareness was a vital step to take, to deliver optimised user interfaces, but it was a baby one. Context is everything, and web publishers are urged to build context-aware web services. Beyond responsive or adaptive brochure-ware websites, there's a real need for something like smart agents to acknowledge, properly greet and guide visitors, and leave them with a memorable first impression, just like in a bricks-and-mortar experience. This broader "Context First" perspective could bring, within the next few months, some radical changes in the way we even think about online presence, and trigger a massive upgrade of current brochure-ware websites into a new breed of smart web agents. To be continued along these lines.
References:

 

Digital engagement in the post-Flash era

"Flash is dead, long live the rich web! Beyond the now very conventional statement, it is about time for digital marketers and creative developers to acknowledge the reality of a "yet again" moving landscape, and to understand the new winning way of digital engagement in a multi-screening world. Strategists, UX specialists and creative technologists are now busy scouting and monitoring the most innovative trends, at their bleeding edge. Here are some of their findings in the fields of cross-browser and cross-device support, mobile applications and gamification. Welcome to the post-Flash era."

About one year ago, I published this private blog post about the end of Flash and the emergence of new ways to engage the audience in the multi-screen world. For the record, I have copied a moderated version of the full article below, but in a nutshell, what I was saying back then is the above.

The multi-screening world was, and still is a reference to the famous report from Google available here: http://www.google.com.au/think/research-studies/the-new-multi-screen-world-study.html

Google Study - The Multi-Screen World

In another article, published in B&T in November 2013, I also mentioned that the future of digital engagement would largely rely on richer, interactive video and 3D/immersive experiences, and I shortlisted Rapt Media as a promising new contender in this arena.

This said, it was only half a surprise today to read this article on The Next Web:

"RIP Flash: Why HTML5 will finally take over video and the Web this year", by Erika Trautman, CEO of Rapt Media.

And I can only strongly agree with the key factors invoked: mobile, and semantic markup for SEO and social.

Well done Erika, now following you on Twitter to learn more about your vision of the rich and semantic next web 🙂

 

## Moderated Article – August 2013 ##
 
Flash is awesome… or at least it used to be consensually rated so, referring here to good old Macromedia Flash, later turned into Adobe Flash. I used to have significant Flash staff by my side throughout my career, and these hyper-specialist developers were in high demand, at least until 3-4 years ago. As of today, though, Flash developers are busy fast-tracking the diversification of their skill sets and looking at other languages, as the demand and expectations from customers evolve.
It is not only that I stopped recommending Flash, but more that the market's appetite shrank. Not because of a mere commoditisation process, as usually happens in the software and service industry, but because of more disruptive changes. And if the well-known enmity with the Apple devices ecosystem is commonly invoked, I believe the sunsetting journey of Flash has been accelerated by powerful and conjugated groundswells in other areas, namely broadband, HTML5, social (as in user-generated content), analytics, and SEO. This is probably what recently led major thought leaders such as Microsoft and Google to drop Flash support from their platforms.
"What a shame!" the Flash enthusiast will say, when you think of all the incredible features it brought to our desktop screens: vector-based lightweight graphics, HD audio and video streaming, microphone and webcam access, low-level bitmap manipulation, hardware-accelerated 2D/2.5D animations, plus a strongly typed, class-based programming language and binary-based sockets.
Note 1: as of August 2014, on seek.com.au, there were 73 job ads for Flash developers, to compare with 453 for HTML5 developers, and 823 for mobile developers. 
Well, one can bluntly reply that HTML5 is now competing with, and even beating, Flash in most of these areas. And if we used to say that HTML5 was a standard in motion, this is now an overstatement, since the HTML5 standard will be final by the end of 2014 (source: W3C).
Take animations for instance, and look at what you can achieve with CSS2 and CSS3 on all mainstream browsers. Beyond the confidential and exotic Chrome Experiments, heavyweight Adobe has already acknowledged that through its HTML roadmap and the new EDGE suite of HTML tools. And that’s without also mentioning the capabilities of the SVG vector graphic format.
In the media arena, the native VIDEO tag now allows playback of H.264-encoded MP4 video in all modern browsers, including Microsoft IE9 and IE10, without a plugin. But there's more: today the Chrome and Firefox browsers even offer native access to the user's microphone and webcam, through the recent Media Capture and Streams recommendation (getUserMedia). And to nail it down, there is an open-source project called WebRTC, enabling real-time video communications through a Javascript library, already supported by Chrome and Firefox as well.
With CSS Filters 1.0 around the corner for dynamic bitmap manipulation (check out this nice demo), and ECMAScript 6.0 bringing more robustness and concision to Javascript syntax, the sky seems to be the limit.
Over the last few months, I have carefully observed these developments and trends, in order to make informed decisions regarding the selection of new technology stacks. And I can mention and share my findings in 3 specific areas:
Cross-browser support and device adaptation
This used to be a strong Unique Selling Point for Flash, given the massive penetration and ubiquitous character of its browser plugin. However, mobile has been a game changer, with an unprecedentedly large number of screen-size and aspect-ratio combinations, in the face of which scaling graphics up and down is just not enough. A promotional banner or interactive piece of content will most likely look a bit odd, or even not be usable at all, once shrunk into a 4" screen.
In the meantime, modern HTML has enabled consistent and adaptive handling of rich content across multiple browsers, and when new HTML5 and CSS3 features are not enough, Javascript and server-side browser detection and processing help bridge the gaps. Don't be mistaken though: this does not mean that the browser wars are over, and the market is still shared by 4 vigorous contenders, if we just consider desktop and laptop browsers. And if Google Chrome is now significantly leading the pack (see the latest StatCounter figures for instance), Microsoft Internet Explorer, Mozilla Firefox and Apple Safari are tagging along (the same situation prevails globally on mobile).
When I hear the word "fragmentation" regarding browsers, this is certainly a reality that still needs to be addressed, since support for the most recent HTML norms and features remains diverse and incomplete. In that regard, I recommend bookmarking 2 useful web services:
  • http://html5test.com allows you to measure the compliance of your browser with the HTML5 standard as a whole. On a scale of 0 to 500, without surprise Chrome leads with 463 pts as of July 2013, versus only 320 for IE9 and IE10.
  • http://caniuse.com provides an extensive list of HTML5 + CSS + Javascript and more features that you can check versus the particular browsers matrix you are targeting for your ongoing projects and customers.
To circumvent the challenge of outstanding inconsistencies between browser versions and screen sizes, the only valid answer is abstraction. This is what the so-called HTML5/Javascript frameworks propose, and this is why they have grown more and more popular over the last couple of years, propelled by buzzwords such as "Responsive Design" or "Device Agnosticity". My goal here is not to establish an endless list of all initiatives and software in this area, but to give a sense of why I have picked some development tools. Just to mention my mainstream options:
 
Mobile applications
We all saw how Apple waged war on Flash, refusing to support it on iOS devices on the basis of stability, security and performance issues; despite some support found at Google within its Android system, this has certainly triggered the downfall of the "15-year-old patriarch". Lately, I have followed noticeable attempts by Adobe to regain ground on iOS via its Flash+AIR platform, but the game seems over, with Google giving another lethal blow by dropping Flash support from Android 4.1 and beyond.
Australia certainly is a particular market, with a persistent domination of the mobile space by Apple, as per the latest StatCounter and Google stats. Nevertheless, the trend stays the same and other contenders are cracking in, with Blackberry still alive in the B2B segment and Nokia/Microsoft nibbling at the heels of Google Android. The days of iOS-only native applications are long over, and any serious digital marketer knows that they need to provide a slick mobile experience on the key dominant and fast-growing platforms to grab the laurels.
And if you can often get away with a neat mobile web experience to fulfil the needs of a branding site or a marketing campaign, there are still some strong and relevant use cases for native mobile applications, at least the following:
  • File system interactions on the device: Manipulating and exchanging documents back and forth between local storage on devices and cloud services.
  • Media interactions: Leveraging the camera and the microphone is better done within a native and locally running app. Take a simple barcode/QRcode scanning feature for instance, but consider also the engagement potential of Augmented Reality.
  • Gaming: Despite the ubiquitous broadband network, there are still situations when the user is off the grid, the game assets are too heavy, or the user experience cannot suffer from unreliable streaming.
  • Interoperability: Users have more and more appetite to multi-task and move content across multiple feature centric apps on their device, regardless of network conditions and real time access to cloud services.
  • Encryption: With recent coverage in the news of massive privacy infringements, not only by hackers but also by government services, strong data encryption is essential, starting with the Finance vertical.
It is a daunting challenge for a digital professional to maintain skills and know-how across the 3 short-tail mobile stacks, namely Apple iOS, Google Android and Microsoft Windows Mobile. For this reason, I have chosen to favour cross-platform development, leveraging frameworks such as Xamarin, Titanium Appcelerator and Adobe PhoneGap.
Gamification
This also used to be an arena dominated by Flash, and it is obviously still very high on Adobe's latest product roadmap, as this might ultimately be its best survival option. But the rise of casual gaming on Facebook and on mobile phones has accelerated the transition towards HTML-based alternatives, or to more mobile-friendly platforms.
As I was recently looking for a mobile-friendly, web-based game, I had the opportunity to review a number of HTML5 engines, and there are no fewer than 17 known major initiatives of interest at the moment to build and deliver Javascript games on mobile devices. I'd like to highlight and support here the remarkable promise of CraftyJS, ThreeJS and VoxelJS. Prefer Chrome to IE or Safari to browse the available showcases, and brace yourself: this is happening without any plugin!
On the richer side of things, which still sits in a 3D + realtime space that browsers alone struggle to explore, I have chosen to rely on best-of-breed and fast-growing solutions such as Corona to create 2D platform games, or the award-winning Unity3D to create advanced 3D games and applications, including Augmented Reality. And if these new contenders on the market of cross-platform game engines initially tried to bet on Flash as an authoring platform and universal player (here for Corona, and there for Unity3D), the conversation has recently shifted dramatically, due to the fundamental lack of open-mindedness, concessions and, ultimately, inspiration on Adobe's end. While Corona was bold enough to voice its reservations out loud as early as October 2011, Unity3D tried hard again with Flash Player 11, but finally threw in the towel in April 2013 with this definitive announcement.
Today, new desktop, mobile and console friendly engines are fully emancipated from the old and cranky grandpa, and delight developers and publishers with native device support, as well as a dedicated web plugin for Unity3D, which has seen exponential growth thanks to its remarkable success on the Facebook games market. The 2 companies have even recently disclosed a technical partnership to make publishing even easier, and they report the web plugin installed on over 200M computers around the world. A new star is born, and I am happily riding it.