MWD03 – Provisioning a local development stack

In the previous post, we set up the Mac workstation and got it ready for modern web development.
In this chapter, we’ll discuss the next key step in setting ourselves up the right way to develop a web application, and this is about creating and provisioning a development environment.
Using Linux is not a crime!
VMs are fantastic
If you plan to create your app with PHP, Java, Python or Ruby, chances are about 90% that you will do so on a Unix/Linux-powered stack. Otherwise you would go for Windows, and things would not be very different anyway.
Before we throw money out of the window renting a server in the cloud, let’s be practical and consider the most obvious option: leveraging your own local workstation to set up a virtual environment. Note again that I advise against using platform ports of xAMP (Apache-MySQL-PHP), for a few good reasons, all along the lines of consistency:
  • Operating system discrepancies (starting with file systems)
  • Software versions
  • Files and folder permissions
  • Stability
That said, the best thing to do is to provision a virtual machine which replicates the target production environment as closely as possible. To this end, we use a virtualisation platform like VirtualBox, as proposed in the previous article, and install any preferred OS stack with it. Let’s assume a CentOS 6.5 64-bit box for the example, but it could be anything else, including a custom, home-brewed VM.
Fortunately for us, instead of downloading the ISO from centos.org and going through the full install process, ready-made boxes are available from several public repositories on the web.
My Vagrant is rich!
Vagrant is an amazing, accessible and free utility, and I can hardly see how the modern web developer could ignore it. It allows you to create and configure lightweight, reproducible and portable development and staging environments, with the exact combination of services and utilities your project needs. I will consistently use and refer to Vagrant hereafter, as I now use it in both my hobbyist and professional lives.
The basics for Vagrant are very well explained on the official site, here: http://docs.vagrantup.com/v2/getting-started/index.html
To install Vagrant, just visit the page and download the right package for your OS: http://www.vagrantup.com/downloads.html
Done, we are ready to provision our Linux stack for development (and possibly staging) purposes:
As I am a RedHat/Fedora/CentOS enthusiast, I go for a CentOS 6.5 64-bit stack, which I pick from the shelves of Vagrant Cloud (but it could have come from anywhere else): https://vagrantcloud.com/mixpix3ls/centos65_64
This one has been set up with the VirtualBox Guest Additions (and I have a specific short article to help you out with upgrading your Guest Additions in case you update VirtualBox).
Let’s first create a working folder:
     $ mkdir ~/sandbox/my_project
     $ cd ~/sandbox/my_project
Now I initialise my local Linux stack:
     $ vagrant init mixpix3ls/centos65_64
=> This immediately creates a local Vagrantfile in your project folder, which you can freely edit and tweak to suit your needs, as we will see later.
One thing you might like to do immediately, though, is to organise proper web port forwarding by inserting the following line in this Vagrantfile:
 
config.vm.network "forwarded_port", guest: 80, host: 8080
As you understand, this Vagrantfile should later be part of your Git repository, as a way to share with your fellow developers what sort of server environment your app is supposed to run on.
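To make this concrete, here is what a minimal Vagrantfile for this box could look like, written from the shell for convenience. The synced folder mapping is an illustrative assumption, and note that vagrant init generates a more heavily commented skeleton which this overwrites:

```shell
# Write a minimal Vagrantfile: the article's box, web port forwarding,
# and a synced folder mapping local sources into the VM's web root.
cat > Vagrantfile <<'EOF'
Vagrant.configure("2") do |config|
  config.vm.box = "mixpix3ls/centos65_64"
  config.vm.network "forwarded_port", guest: 80, host: 8080
  config.vm.synced_folder "./src", "/var/www/html"
end
EOF
```

Commit this file and a teammate gets the exact same port mapping and folder layout with a plain vagrant up.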
For now, let’s just switch the Linux machine ON:
     $ vagrant up
 
It will take some time to download the gigabyte or so of data corresponding to the box image; just watch the progress in your terminal window. Eventually your VM is up and running, and you can seamlessly SSH into it with a single command:
     $ vagrant ssh
 
3 commands in total: Isn’t that amazingly straightforward?
In case you wonder where all this magic happens, beyond the 5 KB Vagrantfile stored in the project folder: Vagrant keeps the downloaded box images under ~/.vagrant.d/boxes in your home directory.
 
This is great, but still a little bare-bones, and we now have to continue provisioning the image with at least a web server and a database server.
Do you manual?
The old dog and control freak in me can’t help doing it the hard, manual way at least once, and this is what it looks like:
$ sudo yum update # Careful because a Kernel update may ruin the Virtualbox Guest Tools
$ sudo yum install ntp
$ sudo service ntpd start
$ sudo rm /etc/localtime
$ sudo ln -s /usr/share/zoneinfo/Australia/Sydney /etc/localtime #properly set server time to this part of the world we love
$ sudo yum install nano # I need my fancy text editor
$ sudo yum install httpd # This is Apache, and you could choose Nginx or Tomcat
$ sudo service httpd start
$ sudo chkconfig httpd on
$ sudo nano /etc/httpd/conf/httpd.conf # Change Admin email
$ cd /etc/httpd/conf.d
$ sudo mkdir vhosts
$ sudo yum install php php-mysql
$ sudo yum install php-* # all extensions, because I can’t exactly tell which ones I need for now
$ sudo nano /etc/php.ini # Change Memory parameters
$ cd /var/www/html
$ sudo nano /var/www/html/phpinfo.php
$ sudo wget https://files.phpmyadmin.net/phpMyAdmin/4.1.13/phpMyAdmin-4.1.13-all-languages.tar.bz2 # fetch the archive first (mirror URL may vary)
$ sudo tar -jxf phpMy*
$ sudo rm phpMyAdmin-4.1.13-all-languages.tar.bz2
$ sudo mv phpMyAdmin-4.1.13-all-languages/ phpMyAdmin
$ cd phpMyAdmin
$ sudo find . -type f -exec chmod 644 {} \;
$ sudo find . -type d -exec chmod 755 {} \;
$ sudo mv config.sample.inc.php config.inc.php
$ sudo nano config.inc.php # Change blowfish secret
$ sudo rm -R setup
$ sudo yum install php-mcrypt # together with mod_ssl below, this is necessary to deliver some reasonably serious work around cryptography
$ sudo yum install mod_ssl
$ sudo service httpd restart
$ sudo yum install mysql mysql-server
$ sudo service mysqld start
$ sudo mysqladmin -u root password ******
$ sudo chkconfig mysqld on
$ sudo yum install git
$ sudo nano /etc/httpd/conf.d/vhosts/my_website.conf
$ sudo service httpd reload
Now just hit:
=> http://127.0.0.1:8080/phpMyAdmin/ to access MySQL via phpMyAdmin
=> http://127.0.0.1:8080 in your local browser, and you should see the magic happening, with the default Apache page.
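A side note on the find/chmod pair used for phpMyAdmin above: it is a generally useful pattern which normalises permissions to 644 for files and 755 for directories in one pass. You can sanity-check it on a scratch directory before unleashing it on a real web root (stat -c is the GNU/Linux form; on the Mac use stat -f '%Lp'):

```shell
# Build a scratch tree, deliberately mess up the permissions,
# then normalise: 644 for files, 755 for directories.
tmp=$(mktemp -d)
mkdir -p "$tmp/site/assets"
touch "$tmp/site/index.php" "$tmp/site/assets/app.css"
chmod 777 "$tmp/site/index.php" "$tmp/site/assets"

find "$tmp/site" -type f -exec chmod 644 {} \;
find "$tmp/site" -type d -exec chmod 755 {} \;

stat -c '%a' "$tmp/site/index.php"   # prints 644
stat -c '%a' "$tmp/site/assets"      # prints 755
rm -rf "$tmp"
```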
It does not seem much, yet you can easily spend 30+ minutes going through the above, assuming everything goes right and you do not inadvertently make a typo.
Worse, everything you’ve just done, your fellow developers in the team would have to do as well when they vagrant up their own local development environments: this is of course not acceptable.
One obvious and immediate countermeasure is to share the custom BOX we’ve just created with our peers via network or cloud storage. This is well covered by Vagrant here: http://docs.vagrantup.com/v2/boxes.html . Your peers would simply have to use the box add command:
$ vagrant box add my-box /path/to/the/new.box
$ vagrant init my-box
$ vagrant up
However, this is still a bit brute-force, and neither flexible nor future-proof: what if I realise I need to add a specific service or configuration? I would have to update my box and copy it again over the network to my fellow developers, and moving around 1 GB of data is not the most elegant thing to do, is it?
Therefore we are looking for a more scripted and flexible way to provision our Linux stack on the fly.
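Vagrant’s built-in shell provisioner is the simplest incarnation of that idea: you commit a bootstrap script next to the Vagrantfile and declare it with config.vm.provision, and it runs (as root, so no sudo needed) on first vagrant up. Below is a deliberately condensed sketch covering a few of the manual steps above; package names follow the CentOS commands we just ran:

```shell
# bootstrap.sh - a very condensed version of the manual provisioning above.
# Wire it into the Vagrantfile with:
#   config.vm.provision "shell", path: "bootstrap.sh"
cat > bootstrap.sh <<'EOF'
#!/bin/sh
set -e
yum -y install ntp httpd php php-mysql mysql mysql-server git
service ntpd start
service httpd start
service mysqld start
chkconfig httpd on
chkconfig mysqld on
EOF
chmod +x bootstrap.sh
```

The nice part: vagrant provision re-runs the script against a running VM, so tweaking the environment no longer means shipping a 1 GB box around.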
In my next article, I will discuss a couple of simple enough yet professional solutions to provision your development environment in a robust and agile manner using either Chef or Puppet.
Next to MWD04 – Provisioning the stack with Chef

MWD02 – Get the Mac workstation ready

Continuing this series about Modern Web Development!
As a non-Microsoft developer working on a Mac, you need to get yourself a few productivity tools which unfortunately are not all available out of the box on your MacBook or iMac.
STEP 1 – Starting with a few Mac OS X enhancements
Total Finder Features
The Finder: even with the latest OS X 10.9 Mavericks update, the Finder still has some shortcomings, and in particular still lacks the ability to quickly toggle the display of hidden system files. My recommended solution is to switch to TotalFinder, US$15 well invested in your productivity. It is also more lightweight and stable than the very ambitious Path Finder (US$39).
Total Terminal
The Terminal: the native Mac terminal is OK, but not always easy to summon in context. To improve usability in that space, I have taken two actions. First, I downloaded the excellent and free TotalTerminal utility from BinaryAge, the same company that publishes TotalFinder; with its Visor feature, you can always summon the Terminal in context using CTRL+~, which is very handy. Second, I am leveraging the new service available in Mavericks which allows you to open a “New Terminal at Folder”, as per instructions found here. Right-click any folder, select Services > Open Terminal, and there you go. A true game changer.
Textmate 2
A decent text editor: TextEdit is definitely outdated, and I wonder when Apple will decide to give it a bit of attention! In the meantime, the modern developer will invest in a Sublime Text or TextMate 2 licence. My preference goes to the latter, which has a much more comprehensive syntax library.
Other CLI developer tools:
Mac OS X native tools: first we need the Mac native tools, which are not installed by default but usually come with Xcode. If you are not planning to develop Mac or iOS apps and do not need Xcode just yet, these tools can simply be installed with the following command in Terminal: $ xcode-select --install (or even simply $ gcc).
This will trigger a dependency pop-up indicating that Xcode requires the CLI developer tools and offering to install them: just click Install and wait around 30 minutes.
CLI Developers Tools
Java: Java is an important language, and unfortunately it is not available natively anymore in the latest releases of Mac OS X.
The current Java version recommended by Apple is 1.6 and it can be downloaded here.
If for some reason you need to work on Java 1.7, visit the official Java website, at your own risk.
Ruby and Python: Ruby and Python, two immensely popular object-oriented scripting languages, have been installed as part of OS X for many years now. But their relevance to software development, and especially application development, has assumed an even greater importance since OS X v10.5. As a modern developer, in particular in the PHP space, you are likely to use Ruby-based tools for automation and deployment, such as Capistrano and Chef (with Composer playing a similar role on the PHP side).
Ruby with Homebrew: Homebrew is a package manager for the Mac which helps you install, in an organised manner, the additional services and utilities that Apple didn’t, managing symlinks into the /usr/local folder. Just like Chef consumes recipes written in Ruby for server provisioning, Homebrew consumes Ruby-written formulas to install software packages on your Mac.
To install Homebrew, run the one-line installer from the official site, http://brew.sh (note a dependency on XQuartz), then check your setup with:
$ brew doctor
To install Ruby with Homebrew:
$ brew install ruby
To manage the different versions of Ruby you may have on your system, use RVM:
$ \curl -sSL https://get.rvm.io | bash -s -- --autolibs=read-fail
PHP Storm
STEP 2: Get a good IDE
There are several options on the Mac, and I can mention here the well-known Eclipse (all languages), Coda 2 (PHP), NetBeans (Java, PHP, C++) and Aptana Studio (PHP, …).
But my preference now clearly goes for the great and consistent suite of IDEs offered by JetBrains for the different languages:
  • IntelliJ IDEA for Java
  • PhpStorm for PHP => That’s the one I mostly use (US$99 for a personal licence)
  • WebStorm for general-purpose HTML/CSS development
  • PyCharm for Python
  • RubyMine for Ruby
  • AppCode for iOS development
Atlassian Sourcetree
STEP 3: Source control with GIT
Git is essential and mandatory, and what you need is the holy trinity: the command line on your workstation, a repository storage service, and a GUI client.
Mac OS X 10.9 comes with Git 1.8.5 bundled, which is not the most recent version.
To upgrade Git on the Mac, the official Git site proposes an installer.
Another option is to use Homebrew, with:
$ brew update
$ brew install git
This will get you the latest Git, version 1.9.2 as I write these lines.
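After the install, it is worth checking which Git the shell actually picks up, since the Apple-bundled version is still around (/usr/local/bin is where Homebrew links its binaries):

```shell
# Check the active Git binary and its version;
# Homebrew's build should win on the PATH via /usr/local/bin
which git
git --version
```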
Source Tree
In terms of GUI, I enthusiastically recommend Atlassian SourceTree, which by the way comes with its own version of Git bundled in, for self-consistency purposes (Git 1.8.4 for SourceTree 1.8.1). The desktop client for Mac proposed by GitHub is also a decent one.
Lastly, make sure you have an account on GitHub, which only offers public repositories on the free plan. For this reason, I urge everyone to consider Atlassian Bitbucket, which offers free accounts with unlimited private repositories for teams of up to 5 members: it can’t get any better!
Transmit
STEP 4: Other essential network utilities
For FTP/SFTP and S3 access, I have found nothing better than Transmit 4 from Panic Software (US$34).
If you are after completely free software though, you would fall back on Cyberduck 4. Finally, FileZilla now enjoys a decent Mac port, but unfortunately lacks S3 support.
Navicat
For database connectivity, the nec plus ultra solution is Navicat, which offers connectivity to MySQL, PostgreSQL, Oracle, MS SQL Server, SQLite and MariaDB. Choose the Essentials edition for only US$16, or go pro with Navicat Premium (US$239).
If you use MySQL exclusively, then Sequel Pro is the perfect, free solution. Likewise for PostgreSQL: simply use Postgres.app.
Charles Proxy
Also, you need a web debugging proxy, and the obvious choice here is Charles 3.9 (US$50).
It is interesting to note, though, that Telerik is currently porting the remarkable Fiddler to Linux and Mac (alpha release).
STEP 5: Virtualisation software
First and foremost, I would strongly advise against setting up local xAMP suites, such as MAMP and XAMPP for Mac. It may look handy at first glance to run your Apache and MySQL servers locally, but it quickly becomes a pain when you start using PHP and Apache extensions: compatibility issues abound, and the setup ends up messy and unrepresentative of production.
Virtual Box
Following the idea that there is “nothing like the real stuff”, I urge any serious developer to work with locally virtualised environments replicating the LIVE destination environments, whatever the operating system.
The best choice is Oracle VirtualBox, free and well supported.
Now you can manually install any of your favourite OS, and set your code folders as shared mount points.
Vagrant
But the latest delightful trick is to use Vagrant to spin up an environment in one command line and no time: download Vagrant for your system, and then simply vagrant up in the terminal! You can select an image on Vagrant Cloud: https://vagrantcloud.com/discover/featured
$ cd ~/my_dev_folder # Put yourself in the folder where you intend to store your web app
$ vagrant init hashicorp/precise64 # Installs a standard Ubuntu 12.04 LTS 64-bit box
$ vagrant up
$ vagrant ssh # You are logged in!
In just a few minutes, you are logged into your VM and can start setting up the services you need for your app, starting with Apache, MySQL, PHP and more.

From Mobile First to Context First

Summary: as we step into the “delight economy” (10), in which you aim at turning first-time visitors and adopters into lifetime consumers and brand advocates, user experience and engagement are essential. And if device awareness was a vital step to take towards delivering optimised user interfaces, it was a baby one; we have yet to embrace more broadly the necessity of context awareness in our online presence and web services, to maximise engagement and conversions.
Mobile is dead

I read this interesting article on the blog of SDL, the software company behind the enterprise CMS Tridion. The title was purposefully provocative: “Mobile is dead” (1).

It makes the point that it’s great for a brand to have a mobile-friendly public website, a couple of mobile apps for iOS and Android, mobile responsive landing pages, and even a mobile-first strategy and fully device-aware web services… but that this is just the very first step in a more ambitious journey towards users, their needs and their context.
“From Mobile First to Context First”, to summarise it in a concise formula after Peter Sena from Digital Surgeons (2). If you google it, you’ll quickly grab an eclectic selection of bookmarks on the topic, walking you through user-centred and context-driven design, and leading you onto the adventurous grounds of pageless design (3), smarter websites and digital storytelling (4) in the delight economy (5).

This is all very exciting and promising. It explains why leading CMS vendors like SDL, Sitecore and Adobe are fiercely competing on this new segment of CXM, or Customer Experience Management, seeking the Grail of ultimate context-based personalisation. And it also justifies the rapid emergence of all these startups focused on analytics and performance monitoring: Crazy Egg, Qualaroo, Optimizely, New Relic, Fuel Deck, etc.

Yet I can’t help wondering whether all this technology is going to help me actually get my “context first” strategy right without a radical change of perspective on the online presence of my company and my brands. Measuring variety and complexity is great to understand it and gain insights, but it can be outrageously onerous to action, as any digital marketing manager can tell you. And even so, once everything is measured and charted, can that really help me overcome complexity, reach this state of simplexity (6) I am looking for, and start producing the delightful experiences which will turn my visitors into lifelong consumers and advocates?

The 5W questions
Maybe this calls for a whole new view of what a website should be. Context First is a good starting point, asking the essential 5W questions: Who, What, When, Where and Why. So let’s discuss how web technologies can help us answer these questions. We’ll discover that we are not necessarily talking about million-dollar software here, but most often about simple native browser and network capabilities.
Who: let’s clear up this one, which is pretty obvious. Unless you have NSA clearance, there’s no way you can tell and guarantee who is visiting your site. Cookies, and in particular those from social networks, offer some interesting footprints, unless you explicitly ask your visitors through a form or a call to action.
When, Where: these are the easy ones, thanks to the native capabilities of browsers, devices and web servers. We can even know “where from”, through the HTTP referrer header, which most often provides invaluable pieces of context.
Why: a subtle question which, outside of an explicit call to action, can only be dealt with in a predictive manner, by matching the known session parameters with preset personas and scenarios.
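To make the “where from” point concrete: the referrer ends up in the standard Apache combined log, where it is trivial to extract. The log line below is made up for illustration:

```shell
# A made-up Apache combined-log entry; when split on double quotes,
# the 4th field is the Referer
line='203.0.113.7 - - [12/May/2014:10:00:00 +1000] "GET /landing HTTP/1.1" 200 512 "https://www.google.com/search?q=widgets" "Mozilla/5.0"'
referer=$(printf '%s\n' "$line" | awk -F'"' '{print $4}')
echo "$referer"   # prints https://www.google.com/search?q=widgets
```

A google.com referrer like this one answers part of the WHY before the visitor has even clicked.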
 
The 3S: Search, Social, Spam
In fact, direct or bookmark access has decreased over the past few years, as most traffic now comes from either search engines (primarily Google) or, increasingly, social networks (primarily Facebook). According to recent studies, and depending on the industry vertical, between 40% and 60% of traffic comes from organic or paid search, and around 20-25% comes from social and referrals, leaving only 20%-40% to direct visits (7) (8). Part of this last chunk is generated by another powerful source of traffic: Email Direct Marketing, EDM, or “Spam” for the sake of pulling together our 3 S’s: Search, Social, Spam.
So WHY are visitors coming to my website? Because they’ve been searching for something, because a friend has shared an interesting piece of information with them, or because they have received an email call to action. In all 3 situations, your website should be able to capture this information and pull together some context for personalisation.
What: this question is twofold, as it involves the visitor but also you as a publisher. What is the visitor looking for, and what am I able and willing to propose to him or her? Back to the WHO question, this can only be determined through proactive guesswork, via carefully designed calls to action on my landing page.
The 3 clicks rule for engagement
Obviously, there will always be situations where you can’t collect much about your direct visitors, due to their privacy settings for instance. You’ll therefore have to get the best out of their first few clicks and interactions to profile them and tailor their experience. As a rule of thumb, and beyond the myth (9), aim at locking in WHO they are and WHAT they are after within a maximum of 3 clicks, in order to serve them best-informed content from the 4th one onward, and produce delight.
Context First
Conclusion
So what about mobile, and browsing devices and software in general: does it really matter that much? It certainly does from a usability perspective, and we ought to serve the most suitable touch UI to visitors on smartphones and tablets. Again, it’s about generating engagement through delight, via the best use of the available capabilities.
But beyond that, talking context? I can browse with a smartphone on the couch, with a tablet from work, and with a laptop from a coffee shop terrace. Does the device really determine my context? Probably not; at the end of the day it’s mostly a game of probabilities.
Device awareness was a vital step to take towards delivering optimised user interfaces, but it was a baby one. Context is everything, and web publishers are urged to build context-aware web services. Beyond responsive or adaptive brochure-ware websites, there’s a real need for something like smart agents to acknowledge, properly greet and guide visitors, and leave them with a memorable first impression, just like in a bricks-and-mortar experience. This broader “Context First” perspective could bring, within the next few months, some radical changes in the way we even think about online presence, and trigger a massive upgrade of current brochure-ware websites into a new breed of smart web agents. To be continued along these lines.
References:

 

Digital engagement in the post-Flash era

“Flash is dead, long live the rich web! Beyond the now very conventional statement, it is about time for digital marketers and creative developers to acknowledge the reality of a ‘yet again’ moving landscape, and to understand the new winning way of digital engagement in a multi-screening world. Strategists, UX specialists and creative technologists are now busy scouting and monitoring the most innovative trends, at their bleeding edge. Here are some of their findings in the fields of cross-browser and cross-device support, mobile applications and gamification. Welcome to the post-Flash era.”

About one year ago, I published this private blog post about the end of Flash and the emergence of new ways to engage audiences in the multi-screen world. For the record, I have copied a moderated version of the full article below, but in a nutshell, what I was saying back then is the above.

The multi-screening world was, and still is a reference to the famous report from Google available here: http://www.google.com.au/think/research-studies/the-new-multi-screen-world-study.html

Google Study - The Multi-Screen World

In another article published in B&T in November 2013, I also mentioned the fact that the future of digital engagement would largely rely on richer and interactive video and 3D/immersive experiences, and I shortlisted Rapt Media as a promising new contender in this arena.

Erika Trautman
This said, it was only half a surprise today to read this article on The Next Web:

“RIP Flash: Why HTML5 will finally take over video and the Web this year”, by Erika Trautman, CEO of Rapt Media.

And I can only strongly agree with the key factors invoked: mobile, and semantic markup for SEO and social.

Well done Erika, now following you on Twitter to learn more about your vision of the rich and semantic next web :-)

 

## Moderated Article – August 2013 ##
 
Flash is awesome… or at least it used to be consensually rated so, referring here to good old Macromedia Flash, later turned into Adobe Flash. I used to have significant Flash staff by my side throughout my career, and these hyper-specialised developers were in high demand, at least until 3-4 years ago. As of today though, Flash developers are busy fast-tracking the diversification of their skill sets and looking at other languages, as the demand and expectations from customers evolve.
It is not just that I stopped recommending Flash; it is more about a reduced market appetite. Not because of a mere commoditisation process, as usually happens in the software and service industry, but because of more disruptive changes. And if the well-known enmity with the Apple device ecosystem is commonly invoked, I believe the sunsetting journey of Flash has been accelerated by powerful, conjugated groundswells in other areas, namely broadband, HTML5, social (as in user-generated content), analytics and SEO. This is probably what recently led thought leaders such as Microsoft and Google to drop Flash support from their platforms.
“What a shame!” the Flash enthusiast will say, when you think of all the incredible features it brought to our desktop screens: vector-based lightweight graphics, HD audio and video streaming, microphone and webcam access, low-level bitmap manipulation, hardware-accelerated 2D/2.5D animations, plus a strongly typed, class-based programming language and binary sockets.
Note 1: as of August 2014, on seek.com.au, there were 73 job ads for Flash developers, to compare with 453 for HTML5 developers, and 823 for mobile developers. 
Well, one can bluntly reply that HTML5 is now competing with, and even beating, Flash in most of these areas. And if we used to say that HTML5 was a standard in motion, this is now an overstatement, since the HTML5 standard will be final by the end of 2014 (source: W3C).
Take animations for instance, and look at what you can achieve with CSS2 and CSS3 on all mainstream browsers. Beyond the confidential and exotic Chrome Experiments, heavyweight Adobe has already acknowledged that through its HTML roadmap and the new EDGE suite of HTML tools. And that’s without also mentioning the capabilities of the SVG vector graphic format.
In the media arena, the native VIDEO tag now allows playback of H.264-encoded MP4 video in all modern browsers, including Microsoft IE9 and IE10, without a plugin. But there’s more: today the Chrome and Firefox browsers natively offer access to the user’s microphone and webcam through the recent Media Capture and Streams recommendation (getUserMedia). And to nail it down, there is an open source project called WebRTC, enabling real-time video communications through a Javascript library, already supported by Chrome and Firefox as well.
With CSS Filters 1.0 around the corner for dynamic bitmap manipulation (check out this nice demo), and ECMAScript 6.0 bringing more robustness and concision to Javascript syntax, the sky seems to be the limit.
Over the last few months, I have carefully observed these developments and trends, in order to make informed decisions regarding the selection of new technology stacks. And I can mention and share my findings in 3 specific areas:
Cross-browser support and device adaptation
This used to be a strong unique selling point for Flash, given the massive penetration and ubiquity of its browser plugin. However, mobile has been a game changer, with an unprecedentedly large number of screen size and aspect ratio combinations, in front of which merely scaling graphics up and down is just not enough. A promotional banner or interactive piece of content will most likely look a bit odd, or even not be usable at all, once shrunk into a 4” screen.
In the meantime, modern HTML has enabled consistent and adaptive handling of rich content across multiple browsers, and when the new HTML5 and CSS3 features are not enough, Javascript and server-side browser detection and processing help bridge the gaps. Don’t be mistaken though: this does not mean that the browser wars are over, and the market is still shared by 4 vigorous contenders, even considering desktop and laptop browsers only. And if Google Chrome is now significantly leading the pack (see the latest StatCounter figures for instance), Microsoft Internet Explorer, Mozilla Firefox and Apple Safari are tagging along (the same situation prevails globally on mobile).
When I hear the word “fragmentation” regarding browsers, this is certainly a reality that still needs to be addressed, since support for the most recent HTML norms and features remains diverse and incomplete. In that regard, I recommend bookmarking 2 useful web services:
  • http://html5test.com allows you to measure the compliance of your browser with the HTML5 standard as a whole. On a scale of 0 to 500, Chrome unsurprisingly leads with 463 pts as of July 2013, versus only 320 for IE9 and IE10.
  • http://caniuse.com provides an extensive list of HTML5 + CSS + Javascript and more features that you can check versus the particular browsers matrix you are targeting for your ongoing projects and customers.
To circumvent the challenge of outstanding inconsistencies between browser versions and screen sizes, the only valid answer is abstraction. This is what the so-called HTML5/Javascript frameworks propose, and this is why they have grown more and more popular over the last couple of years, propelled by buzzwords such as “Responsive Design” or “Device Agnosticity”. My goal here is not to establish an endless list of all initiatives and software in this area, but to give a sense of why I have picked some development tools as my mainstream options.
 
Mobile applications
We all saw how Apple waged war on Flash, declining to support it on iOS devices on the basis of stability, security and performance issues; and despite some support found at Google within its Android system, this has certainly triggered the downfall of the “15-year-old patriarch”. Lately, I have followed noticeable attempts by Adobe to regain ground on iOS via its Flash+AIR platform, but the game seems over, with Google dealing another lethal blow by dropping Flash support in Android 4.1 and beyond.
Australia certainly is a particular market, with a persistent domination of the mobile segment by Apple, as per the latest StatCounter and Google stats. Nevertheless, the trend stays the same and other contenders are breaking in, with BlackBerry still alive in the B2B segment and Nokia/Microsoft nibbling at the heels of Google Android. The days of iOS-only native applications are long over, and any serious digital marketer knows that he needs to provide a slick mobile experience on the key dominant and fast-growing platforms to grab the laurels.
And if you can often get away with a neat mobile web experience to fulfil the needs of a branding site or a marketing campaign, there are still some strong and relevant use cases for native mobile applications, at least the following:
  • File system interactions on the device: Manipulating and exchanging documents back and forth between local storage on devices and cloud services.
  • Media interactions: Leveraging the camera and the microphone is better done within a native and locally running app. Take a simple barcode/QRcode scanning feature for instance, but consider also the engagement potential of Augmented Reality.
  • Gaming: Despite the ubiquitous broadband network, there are still situations when the user is off the grid, the game assets are too heavy, or the user experience cannot suffer from unreliable streaming.
  • Interoperability: Users have more and more appetite to multi-task and move content across multiple feature centric apps on their device, regardless of network conditions and real time access to cloud services.
  • Encryption: With recent coverage in the news of massive privacy infringements, not only by hackers but also by government services, strong data encryption is essential, starting with the Finance vertical.
It is a daunting challenge for a digital professional to maintain skills and know-how across the 3 short-tail mobile stacks, namely Apple iOS, Google Android and Microsoft Windows Mobile. For this reason, I have chosen to favour cross-platform development, leveraging frameworks such as Xamarin, Titanium Appcelerator and Adobe PhoneGap.
Gamification
This also used to be an arena dominated by Flash, and it is obviously still very high on Adobe’s latest product roadmap, as this might ultimately be its best survival option. But the rise of casual gaming on Facebook and on mobile phones has accelerated the transition towards HTML-based alternatives, or to more mobile-friendly platforms.
As I was recently looking for a mobile-friendly web-based game engine, I had the opportunity to review a number of HTML5 engines, and there are no fewer than 17 known major initiatives of interest at the moment to build and deliver JavaScript games on mobile devices. I’d like to highlight and support here the remarkable promise of CraftyJS, TreeJS and VoxelJS. Prefer Chrome to IE or Safari to browse the available showcases, and brace yourself: this is happening without any plugin!
On the richer side of things, which still sits in a 3D + realtime space browsers alone struggle to explore, I have chosen to rely on best-of-breed and fast-growing solutions such as Corona to create 2D platform games, or the award-winning Unity3D to create advanced 3D games and applications, including Augmented Reality. And if these new contenders on the market of cross-platform game engines initially tried to bet on Flash as an authoring platform and universal player (here for Corona, and there for Unity3D), the conversation has recently shifted dramatically, due to the fundamental lack of open-mindedness, concessions and ultimately inspiration on Adobe’s end. While Corona was bold enough to voice its reservations out loud as early as October 2011, Unity3D tried hard again with Flash Player 11, but finally threw in the towel in April 2013 with this definitive announcement.
Today, new desktop-, mobile- and console-friendly engines are fully emancipated from the old and cranky grandpa, and delight developers and publishers with native device support, as well as a dedicated web plugin for Unity3D, which has seen exponential growth thanks to its remarkable success on the Facebook games market. The two companies have even recently disclosed a technical partnership to make publishing even easier, and they report the web plugin installed on over 200M computers around the world. A new star is born, and I am happily riding it.

MWD01 – Embracing the evolving digital world

I have been coding and managing digital projects for the last 15 years, and the field has already been through a number of evolutions, revolutions and complete paradigm shifts. Here are just a few areas, to illustrate where I’m going:

Source Control Management (SCM): think CVS, Perforce and ClearCase back in the day. Subversion had some glorious days, but over the past few years the crowd has moved to Git, made popular and ubiquitous by the amazing code-sharing platform GitHub. Git is a free and open-source distributed version control system designed to handle everything from small to very large projects with speed and efficiency. It is easy to learn and has a tiny footprint with lightning-fast performance. It outclasses other SCM tools with features like cheap local branching, convenient staging areas, and multiple workflows. It is best suited to distributed teams working in a peer-to-peer context, and it is certainly the SCM of the decade: you can’t avoid it, even if you’ll keep an eye on emerging competitors like Mercurial.
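The essence of that workflow fits in a handful of commands. Here is a minimal local sketch (the project and branch names are illustrative, and the identity is configured locally just for the demo):

```shell
# Create a repository, make a first commit, then branch cheaply and locally.
mkdir demo-project && cd demo-project
git init -q
git config user.email "dev@example.com"   # local identity for this demo only
git config user.name "Demo Dev"
echo "# Demo" > README.md
git add README.md
git commit -q -m "Initial commit"         # snapshot into local history
git checkout -q -b feature/login          # a cheap, instant local branch
```

From there, `git merge feature/login` back on the main branch, or `git push` to a remote like GitHub, completes the distributed loop.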

Open-Source Software: Long gone are the days when you’d pay for closed-source software without blinking, with the double risk of getting ripped off by ongoing maintenance costs and being locked into a technological dead end. Web technologies are experiencing a sort of Cambrian explosion over an incredibly short timespan. People try before they buy; they want to access source code to customise it to suit their specific requirements, and share it with business partners to encourage interoperability. As a customer today you will happily pay for a service and a subscription rather than for the codebase; you will pay for advanced features and integration rather than for the core functionality. It is on the grounds of this need for interoperability and agility that the open-source model has grown and matured, including in the enterprise arena. It is true in all compartments: operating systems with Linux distributions such as RedHat and CentOS, databases with MySQL and PostgreSQL, server-side languages such as Ruby, Java and PHP, content management frameworks such as Drupal or Alfresco, and it is just the beginning. “Start free and scale up with service and features” is the new motto in the global era, and far from meaning just “open” and “free”, Open Source is a true guarantee of stability, security, interoperability and enterprise-grade quality.

Dependency management: Along the lines of interoperability, we see a growing number of technologies built on top of large aggregations of third-party dependencies, and this leads to an increasing interest in managing software packages, bundles, plug-ins, add-ons, and so on. An iconic example in the CMS arena is WordPress, which proposes the most comprehensive library of 3rd-party plugins to address various needs and features, from simple contact form handling to broad-scale e-commerce. At a lower level, this translates into component libraries for web application frameworks such as Symfony (PHP, best managed with Composer for instance) and Rails (Ruby, best managed with Bundler), and down to the languages themselves, with the Pear and PECL repositories for PHP, and RubyGems for Ruby. This is now even extending to the field of deployment automation, with platforms like Capistrano, Vagrant and Chef.
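To make this concrete, here is what declaring a PHP dependency with Composer looks like. The package and version constraint are illustrative; running `composer install` where Composer is available would then resolve and download everything into `vendor/`:

```shell
# Declare project dependencies in composer.json (package name is illustrative).
cat > composer.json <<'EOF'
{
    "require": {
        "monolog/monolog": "^2.0"
    }
}
EOF
# composer install   # resolves and fetches packages into vendor/ (needs Composer)
```

Bundler for Ruby works the same way with a `Gemfile` and `bundle install`: declare what you need, let the tool resolve versions and transitive dependencies.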

Model-View-Controller (MVC): MVC is the modern software pattern for implementing user interfaces in a scalable and future-proof manner. It divides a given software application into three interconnected parts, so as to separate internal representations of information from the ways that information is presented to or accepted from the user. The central component, the model, consists of application data, business rules, logic and functions. A view can be any output representation of information, such as a chart or a diagram; multiple views of the same information are possible, such as a bar chart for management and a tabular view for accountants. The third part, the controller, accepts input and converts it to commands for the model or view. It is clear that over the last 5 years all dominant and enterprise-grade languages and application frameworks have quickly evolved towards the recognition of this universal and most desirable pattern: Java with Spring for instance, PHP with Symfony, CakePHP, CodeIgniter, Yii and more, .NET with ASP.NET, Ruby with Rails. Even JavaScript is currently undergoing the same rapid evolution client-side, with amazing and highly performant frameworks like Backbone, Knockout or the most recent Google AngularJS.

APIs and web services: Going hand in hand with the MVC revolution, the emergence and silent multiplication of web services and public APIs (Application Programming Interfaces) on the web is staggering when you consider it: there’s a crowd of agents and open endpoints out there just waiting for you as a developer to leverage them and unleash their power within your application. ProgrammableWeb maintains a directory of 11,000+ APIs, and they are not alone (here and here for instance). All world-class service providers offer their own, such as Google. APIs are driven by a set of specific technologies, making them easily understood by developers. This type of focus means that APIs can work with any common programming language, with the most popular approaches to delivering web APIs being SOAP and REST. REST with JSON has become the favourite of developers and API owners, because it is easier both to deploy and to consume than other implementations. Even though REST + JSON is not a standard, it is now seeing the widest acceptance across the industry. Again, interoperability is the key driver here, and it is important to make sure your applications can consume 3rd-party web services, and expose new specific ones in the most robust manner. Online providers even specialise in the brokering of web APIs, like for instance Zapier, ElasticIO and Talend.
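Part of why REST + JSON has won developers over is that consuming it takes nothing more than an HTTP client. A sketch with `curl` (the endpoint URL and payload below are purely illustrative):

```shell
# Calling a REST + JSON endpoint is one command (hypothetical URL, shown inert):
# curl -s https://api.example.com/v1/users/42 -H "Accept: application/json"

# The response is plain JSON, easy to inspect or parse with everyday tools.
# Here we fake the payload locally so the example runs without a network:
cat > response.json <<'EOF'
{"id": 42, "name": "Ada", "roles": ["admin"]}
EOF
grep -o '"name": "[^"]*"' response.json   # prints "name": "Ada"
```

Compare that with a SOAP call, which needs an XML envelope, a WSDL contract and usually a client library: the difference in friction explains the industry-wide shift.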

Virtualisation: I can’t really remember the last time I had to play with a real physical server on which I installed Windows or Linux to create a web hosting environment: it was probably more than 10 years ago, on a super-expensive, rackable Dell pizza box. Since then, all my web hosting providers have been offering virtualised environments on shared hardware, and this has grown to a massive scale lately with a pack of providers led by Amazon and including Rackspace, Digital Ocean, Brightbox and many others. As I write these lines, I have 4 different systems running on my Mac, for development and productivity purposes, courtesy of VMware Fusion or Oracle VirtualBox: beyond Mac OS X 10.9 as the main host, I have Windows 8.1 and two server flavours of Linux (CentOS 6 and Ubuntu 12). Virtualisation means versatility and cost efficiency, since anyone can now run a fully fledged web server from home for just a fistful of dollars.
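A taste of how cheap a local VM has become, using Vagrant on top of VirtualBox or VMware Fusion. The box name and port mapping are illustrative; the `vagrant` commands are left commented since they require the tools to be installed:

```shell
# Describe a reproducible CentOS VM in a Vagrantfile (box name is illustrative).
cat > Vagrantfile <<'EOF'
Vagrant.configure("2") do |config|
  config.vm.box = "centos/6"
  config.vm.network "forwarded_port", guest: 80, host: 8080
end
EOF
# vagrant up     # boots the VM on VirtualBox or VMware Fusion
# vagrant ssh    # drops you into a shell on the guest
```

The whole environment is described in one small text file that can be committed to Git and shared with the team, which is exactly the point.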

Cloud and web-scale IT: Cloud has been a buzzword for a few years now, but we are still hardly realising what it means in terms of volume, scale and commoditisation. Amazon has been pioneering this space and still largely leads it, allowing individuals and businesses to fire up dozens of virtual servers and run web apps anywhere in the world in just minutes, for competitive monthly charges based on consumption. What it really means is that IT infrastructure is no longer the realm of super-specialised engineers and techs; it is now almost completely commoditised, and we enter the age of what is called “infrastructure as code”, where any mid-weight DevOps engineer can fire up a world-class architecture of hundreds of servers by simply running a shell script of a few lines: literally scary! New contenders in this market of IT automation and web-scale IT are Ansible, PuppetLabs, Docker and Chef, for instance. And these guys are going to take the industry by storm over the next 2-3 years.
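“Infrastructure as code”, reduced to its simplest expression, really is just a script: a repeatable recipe that turns a bare VM into a working server. A minimal sketch (package and service names are the classic CentOS ones; the script is written to disk here, not executed):

```shell
# Write a provisioning recipe that would turn a bare CentOS box into a LAMP server.
cat > provision.sh <<'EOF'
#!/bin/sh
set -e                                  # stop on the first failure
yum -y install httpd php mysql-server   # install the web stack
service httpd start                     # start the web server
chkconfig httpd on                      # make it survive reboots
EOF
chmod +x provision.sh
```

Tools like Ansible, Puppet and Chef take this same idea and add idempotence, inventories and orchestration across hundreds of machines.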

User Experience Optimisation: Last but not least, just as the 2000s were the decade of the CMS, the current period is undoubtedly focused on better managing user experiences online, and that’s achieved through a variety of means, including HTML5-rich, device-agnostic user interfaces and personalised, interactive content. But at the end of the day, this is only possible through extremely granular monitoring of performance, instant capture of exceptions and errors, and fine-grained analytics. Google is still a heavyweight in that space, and enterprise content management systems battle fiercely to stay visionary leaders (Sitecore Customer Engagement Platform, Adobe Experience Manager, SDL Tridion, …). However, from a developer perspective there are huge opportunities to create value in that space, through server-side and client-side monitoring of web apps, and a significant number of startups have invested in that space, such as NewRelic, Sentry and FuelDeck. These are essential components for delivering optimised online experiences.

Next: Part 2 – Getting the Mac workstation ready