Gary Lefman “There will be a greater understanding of the need for internationalisation”
I’m glad to be back to blogging after a half-year break and to present a new interview with Gary Lefman, who works at Cisco and has superpowers in localisation and internationalisation :) You can have a look at his blog.
If you missed the other great interviews or want to refresh your memory, head over to the interviews page.
Please tell a few words about yourself. How did you get into internationalisation and localisation?
My response will probably sound very familiar to most people in the localisation industry. The reason I am where I am today, and the reason why I have modelled my life around this industry is simply because of an accident – a disastrous accident – if that makes it seem any less passé.
I was invited to join Cisco in 2000, before I had finished my undergraduate degree, and became a network engineer with a research and development group specialising in telecommunications protocols and switching. It wasn’t long before I started to design, build, and manage development labs in England and China.
Somewhere around the time the dotcom bubble had a blow-out, management volunteered me to rectify a tragic cocktail of localisation issues with a prominent voice product. I had never heard of the term localisation, let alone knew what it meant at the time, but I threw myself into the project with full gusto – blind and naïve – as a good engineer always should, and solved the problem. Unbeknown to me at the time, I had altered the course of a crushed localisation project, which subsequently exploded from four locales into 52 in almost no time at all. Having been thrown to the wolves, and walked away unscathed, I had gained a level of recognition and respect that fuelled my decision to switch to the dark, and far more exciting I might add, side of voice localisation engineering.
With an abnormal thirst for producing truly global products, it was only natural that I should move into an architectural role and focus on developing an internationalisation strategy for the entire engineering organisation. This involved developing an internationalisation support structure for developers and internationalisation champions, and a full training programme covering everything from product and content internationalisation to multimedia localisation. In the meantime I have been working on several projects outside of Cisco.
At the end of 2013 I graduated with first class honours from the University of Limerick with an MSc in Multilingual Computing and Localisation, via the Localisation Research Centre. Not wanting to stop there, I am now working on my PhD with CNGL, the Centre for Global Intelligent Content, at Trinity College Dublin. I am also a partner (and CTO) in a fantastic company that redefines localisation education (launches 1 June 2014), and a director for an internationalisation consultancy.
In December 2013 my first book was published, called Internationalisation of People Names. It’s a study of human name structures around the world and a model to prevent identity loss within computer systems. Earlier this year I was recognised as a Fellow of the British Computer Society, and a Fellow of the Royal Institution of Great Britain before that. Combining all this with a strong background in internationalisation and localisation engineering, I’d say that I still have a very long way to go, but in the meantime, you can keep up with me on Twitter as @CiscoL10N and on LinkedIn.
Can you describe some peculiarities of Internationalisation platform for Cisco products you developed?
Up until five years ago I would have said there was nothing peculiar about internationalisation. Throw in a few standards and best practices, and anything is possible.
But today, with bring your own device (BYOD), and the so-called Internet of Everything, we have really mixed things up. To the point where developers are rushing to produce application programming interfaces but do not consider how the fruits of their labour will be implemented in locales other than their own. This is probably due to a focus on supporting different manufacturers’ platforms, more than the people that use them.
With a lot of applications being developed in the USA, there is still too little consideration for the global user. This isn’t exactly a peculiarity of internationalisation per se, because every platform has its own qualifications and quirks, but it is still a challenge nonetheless.
What is the most challenging in your work?
Changing the mind-set of development teams, convincing them of the real value of internationalisation, is a monumental task. The first barrier is deep-seated common misconceptions that inject ice cold fear into the heart of many developers when we utter the term localisation. It is fear of the unknown and self-doubt that ultimately cause a developer’s resolve to crumble.
I believe this challenge can be addressed, to a certain degree, by academic institutions. Schools, colleges, and universities continue to be oblivious to the need for internationalisation when teaching computing and writing for an audience. They often fail to provide awareness of the world’s variety of cultures and how these cultures perceive and interact with the very systems that students may one day be developing themselves.
Can you recommend any best practices and tools for proper internationalisation?
Without any doubt, the Common Locale Data Repository (CLDR) is the most quintessential resource a developer should have in their bag of tricks. This thoroughly grounded library of locale data will help enormously in the development of truly global products, and use of programming libraries like the International Components for Unicode (ICU) will make implementation of the CLDR child’s play.
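To illustrate why the CLDR matters, here is a toy sketch of CLDR-style plural category selection. The rules below are hand-simplified from the CLDR plural rules for English and Russian integers (fractions are ignored); a real application should not hard-code these but use a library such as ICU, which ships the full CLDR data.

```python
def plural_category(locale: str, n: int) -> str:
    """Return the CLDR plural category for the integer n in the given locale.

    Hand-simplified from CLDR plural rules; covers only 'en' and 'ru',
    and only integer values.
    """
    if locale == "en":
        # English: only "one" (n == 1) and "other".
        return "one" if n == 1 else "other"
    if locale == "ru":
        # Russian: 1, 21, 31, ... -> "one" (but not 11);
        # 2-4, 22-24, ... -> "few" (but not 12-14); the rest -> "many".
        if n % 10 == 1 and n % 100 != 11:
            return "one"
        if 2 <= n % 10 <= 4 and not 12 <= n % 100 <= 14:
            return "few"
        return "many"
    raise ValueError(f"no rules for locale {locale!r}")

# English collapses almost everything into "other"...
print(plural_category("en", 1), plural_category("en", 5))  # one other
# ...while Russian integers need three categories.
print(plural_category("ru", 21), plural_category("ru", 3), plural_category("ru", 11))  # one few many
```

This is exactly the kind of locale knowledge that should never be reinvented per product: the CLDR already encodes it for every locale, and ICU exposes it through ready-made APIs.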
Adopting new technologies and standards, such as the Internationalization Tag Set (ITS) 2.0 and HTML5, will also make life much easier for developers and localisers alike.
Developers should also consider moving towards Best Current Practice 47 (BCP 47) for finer-grained locale codes that better represent languages and cultures than the simple ISO 639-1 and ISO 3166-1 codes for languages and countries respectively.
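As a minimal sketch of what BCP 47 buys you, the snippet below splits a language tag into its language, script, and region subtags. It deliberately ignores variants, extensions, and grandfathered tags; production code should use a full implementation (for example, ICU) rather than this toy parser.

```python
def parse_bcp47(tag: str) -> dict:
    """Split a simple BCP 47 tag into language / script / region subtags.

    Toy parser: handles only primary language, script (4 letters),
    and region (2 letters or 3 digits) subtags.
    """
    parts = tag.split("-")
    result = {"language": parts[0].lower(), "script": None, "region": None}
    for part in parts[1:]:
        if len(part) == 4 and part.isalpha():    # script subtag, e.g. "Hant"
            result["script"] = part.title()
        elif len(part) == 2 and part.isalpha():  # ISO 3166-1 region, e.g. "TW"
            result["region"] = part.upper()
        elif len(part) == 3 and part.isdigit():  # UN M.49 region, e.g. "419"
            result["region"] = part
    return result

# "zh-Hant-TW" says far more than a bare ISO 639-1 "zh":
print(parse_bcp47("zh-Hant-TW"))  # {'language': 'zh', 'script': 'Hant', 'region': 'TW'}
print(parse_bcp47("es-419"))      # Latin American Spanish, a region no country code captures
```

The point is that a script subtag like "Hant" versus "Hans", or a UN M.49 region like "419", distinguishes audiences that a plain language-plus-country pair cannot.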
As for tools, the most important tool I have worked with is Globalyzer by Lingoport. This is a crucial piece of kit for any developer: it performs static internationalisation analysis and captures almost all of the common internationalisation problems that would otherwise only become apparent during product localisation, ultimately impacting the time and cost of software development.
How do you envision the future of localisation industry?
Today, the World Wide Web of information is predominantly accessible in the English language, but this will undoubtedly change. Why? Because advances in the way audio, visual, and textual content is linked to similar content in other languages will help to make all of this content even more available to the Internet. This is inevitable, and, to compound the matter, the vast amount of new information (and disinformation) being added to the Web on a daily basis, means there is going to be an ever-increasing desire to share it with the world. This is where we step in, but it isn’t going to be an easy ride.
Things are going to move very quickly when it does happen, and whilst the smaller localisation service providers are lean and agile, the larger, institutionalised, providers are going to be faced with some tough choices if they want the best seats in the show. They will need to develop a culture of adaptation and learn how to change direction in a very short space of time. They will also need to know their limits, because they will find it detrimental to their business if they attempt to accept every job. I also foresee a lot of new language service providers appearing on the scene. They will be more specialised, focussing only on smaller and more specific domains, such as social media or tourism.
This increase in demand will also impact developers, but there will be a greater understanding of the need for internationalisation. This will accelerate research into more effective internationalisation standards, and better integration of internationalisation into programming languages and software development tools. The result: seamless internationalisation and localisation workflows. Is this all pie in the sky? Perhaps.
But note that I haven’t mentioned machine translation yet. This is because the advances in machine translation will probably continue to be slow and painful. If we haven’t cracked it since the 1950s, then we are not going to crack it in the next sixty years because the human brain is a brilliant, yet at the same time wonderfully complicated, organ, and no machine will ever match it.
Many thanks for your insights, Gary!