iCarAsia Engineering

A cup of tea with Ari (2017-10-06)

We have covered a lot of new things Ari has learned in the last few months. In fact, we have shared every step she took along the way.


Today I want to share something different with you. I am not going to dwell on new features or anything of the sort.

I just want to highlight how much our girl has grown up.

Today in the afternoon, I took a cup of tea and put it beside my computer. It was a much-needed break. I wasn’t running tests or analysis on Ari. I just wanted to have a quick conversation. Like a watercooler break.

And that’s what I did.

I didn’t know I could but it was one of the most pleasant conversations I’ve had with her.

I said, “Hi”, and she replied with, “Hi there my friend”.

Not having a car-related query in mind, I just asked her, “How are you Ari?”

And she replied like any teenager beaming with youthful energy. Her exact words:

“Wonderful as always. Thanks for asking.”

My heart was elated. It may not do much for you, but for me this conversation was refreshing. I have seen all her baby steps, to the point where she can converse in depth, including small talk.

I then logged off and entered another person’s name when signing up. She remarked, “Nice to meet you John” – she never used to greet people like this before.

Moving on,

I was completely gripped in the conversation, so I decided to keep talking to her whilst having my tea.

My last chat with her was about the Honda Civic 2017. It was still showing in our discussion. So straight away I asked her, “Ari, I want to buy the Honda Civic 2017, as you know.”

She replied, “Let me get the Honda Civic 2017 details for you.”

Then the usual, show me this deal, show me that deal, compare this and compare that.

She kept helping me out. Showing me some amazing deals and car comparisons.

And that was that, I had finished my cup of tea.

I said goodbye to her and received a “goodbye” greeting in return.

So there it was, a nice, pleasant cup of tea with Ari. I didn’t actually want car deals. I wanted some refreshment with my tea. And I got that. Just like a father would want to sit down and have a brief five-minute conversation with his daughter about her day, that is the feeling I got when I was talking to Ari.

Our little girl has grown up so much, it is hard to believe.

5 ways artificial intelligence is driving the automobile industry (2017-10-04)

Artificial intelligence seems to be taking the world by its throat. The automobile industry is no different; in fact, car manufacturers are among the leading researchers and visionaries in AI. All the major automobile companies are currently in a scuffle, trying to come out on top with their AI.

What makes AI so compelling in the car industry is the fact that it learns and evolves from sensory inputs such as real sounds and images.

So when AI is applied to an automobile, it starts picking up on the environment. It starts learning and becoming better. It prepares to face hurdles, applies new tactics and learns new things.

In 2015, there was only an 8% install rate of AI systems on vehicles. It is forecast that by 2025 the same will shoot up to 109% – more than one AI system per vehicle, on average. Needless to say, 2025 will look very different as far as the automobile industry is concerned – artificial intelligence is the future.

The most common AI ‘stuff’ you hear about and see on the tele is the self-driven car. The idea has been around since the 1970s, and mankind’s fascination has grown with it.

Recently, Ford invested $1 billion in perfecting a self-driven car that could be on the roads by 2021. Not everyone is sure about the concept. Some argue that people love driving, that cars are their passion, and that self-driven cars won’t be welcomed.

It is an interesting and thought provoking debate but let’s set it aside for another time.

Most people think that AI is limited to self-driven cars, and don’t realize that AI already surrounds modern non-self-driven cars. There are other segments of automotive AI beyond driverless vehicles.

Let’s take a look at some of them. Here are some ways AI is driving and will be driving the auto industry in the future.  

A process called Machine Learning

Most people think that AI is Machine Learning, whereas Machine Learning, commonly called ML, is a different thing. It is in fact a sub-branch of AI. AI is directed at making machines act like human beings, whereas ML is an application of AI where machines are given some data and learn more and more based on that data.

So instead of working like humans, machines work like machines, but in a refined way. Recently, Toyota’s research group said that they are working on systems to help drivers who are not very good drivers. These will aid drivers during various processes, making driving easier for them.

Deep Learning

Deep Learning, or DL, is a process through which Machine Learning is implemented. DL breaks tasks down into manageable segments. The DL software starts learning and applying new things, one at a time.

DL is growing in leaps and bounds in the automobile industry. It is the technique that drives ADAS (advanced driver assistance systems) and autonomous driving. DL deals not only with what goes on inside the vehicle but with what goes on outside it as well: the temperature, road conditions, the driver’s health and mental fitness, and various other things.

The internet of things

The internet of things has been a revelation in the automobile industry, and it looks like it will go even further. Modern cars come with smart sensors, GPS and various connectivity applications.

With such systems, here are some of the benefits that car manufacturers are giving their customers. In the future, not only will these processes be refined, but more and more cars are poised to have them:

  • The vehicle comes with fully loaded firmware (software).
  • That software is updated automatically, with no need to go to the manufacturer.
  • The vehicle gives a health report to its owner and the manufacturer, and tells you when it needs repairs and tuning.
  • Better safety features, such as an intelligent braking system where the car stops automatically if it senses it will crash into an object.
  • Manufacturing, maintenance and after-sales service quality have significantly improved due to real-time performance data.
  • Here is one for the future: in case of a medical emergency, the car’s sensors would contact a medical center for help.

Cognitive capabilities

When it comes to cars, there are endless possibilities for cognitive learning capabilities. Imagine not a small ordinary car but a huge 18-wheel truck being self-driven. Would it know when to stop for gas?

Let’s even forget the driverless element for a second. Would a truck realize whether it is overloaded or properly loaded with material? Scientists and researchers are working on that. Instead of a safety inspector, the truck itself will tell you if any rules are being violated.

How will this happen? This one is for the future. In the years to come, there will be a system in cars with cognitive ability: a consciousness for self-preservation, just like human beings have. The machine will have a concern for its own and other people’s lives.

If this doesn’t excite you, I don’t know what will.

Infotainment systems revamped

AI completely changes the scope of infotainment systems. These days both the software and the sensors of a car are compatible with an AI system. These systems include speech recognition software, eye-tracking techniques, multiple-language recognition and driver monitoring. Based on the data the car collects, it will make life easier for you, the driver.

Not only will it optimize the vehicle for you, it will make recommendations based on the data. For example, if you drive better right after having a coffee, it will detect the coffee in your system and monitor your driving performance. After analysis, it will suggest that you have coffee every morning before you drive to work.

Here is how AI will make your car very smart (2017-10-01)

Did you know that the installation rates for artificial intelligence programs will go from 9% in 2017 to 109% in 2025?

The artificial intelligence (AI) systems currently used in entertainment devices in cars and in advanced driver assistance systems (ADAS) will rise from about 7.8 million to 122 million units by 2025. This was predicted in a new report by IHS Technology, the world's leading source for research, analysis, and strategic guidance in the technology, media, and telecommunications industries.

IHS recently published the Automotive Electronics Roadmap report, in which they conclude that AI installations are expected to multiply several times over in less than a decade. Many AI systems will be included in cars to handle some very basic things. In fact, cars will have remarkable safety features that reduce accidents significantly.


According to Luca De Ambroggi, IHS Technology’s chief analyst for automotive, "An artificial-intelligence system continuously learns from experience and by its ability to discern and recognize its surroundings," he adds, "It learns, as human beings do, from real sounds, images and other sensory inputs. The system recognizes the car's environment and evaluates the contextual implications for the moving car."

In the coming five years or so, AI systems will fall mainly into two categories:

  • The first category is what scientists are calling infotainment. It includes various interesting features such as speech recognition, gestures, eye tracking, fingerprint scanners and language assistant systems.
  • The second category is ADAS (advanced driver assistance systems), which we mentioned briefly at the beginning of the article. This category of AI deals with advanced features like radar, advanced GPS, vision systems, driver condition evaluation, sensor control and engine control units (ECUs).

Now we move onto the interesting part…

Ambroggi believes that there is a great deal of room to fill. Since infotainment and ADAS are getting more and more complex, there is a need for advanced software and hardware installations to aid the entire process. AI needs reinforcement in both the hardware and software departments so it can function more and more like a human brain.

But we are on the right track. The IHS report says that ADAS programs are self-learning and ever-improving, with notable advantages over traditional algorithms. The report predicts that ADAS is the key to fully self-driven vehicles.

Here is an extract from the report,

"For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and will reduce development time of ADAS systems,"

A couple of years ago, a renowned market research organization reported that by 2020 there will be 250 million cars connected to a complex Wi-Fi system that will allow them to communicate with each other and various roadway systems.

The predictions made by Gartner say that a lot of information is already being fed into the IVI units and telematics systems of cars. This is already building the foundation for a very solid info-base. The learning process and the self-evolving algorithms have already been initiated.

Last year, the CEO of the Toyota Research Institute, Dr. Gill Pratt, had some interesting input with regard to AI systems. Speaking at the New England Motor Press Association Conference at MIT, Pratt said that in the coming years many driving features will be controlled by “guardian angels”, referring to ADAS.

He downplayed the idea of fully autonomous vehicles that don’t need a driver. He believes that driving and cars are a passion that humankind is not ready to give up yet, and said that more research is being done to help drivers rather than to make them obsolete.

"If you love to drive, the idea of a chauffeur is not fun," Pratt says. "Driver skills are ignored with a chauffeur; with guardian angel technology, you're augmenting human driving skills."

Pratt went on to explain the focus of his research and that of other pioneers in the industry. He explained that someday the car will warn you, right before you begin driving, about a dangerous habit you have that may interfere with your driving capabilities.

Top 5 AI powered personal assistants that have become commonplace (2017-10-01)


One can argue that it all started with Siri, iPhone’s built-in phone assistant. Since then, everyone is looking to get on the Artificial Intelligence (AI) bandwagon. The growing advancement and fascination for AI has made it a very promising market segment.

Almost all the large tech companies have either jumped in or are ready to. Interestingly, the phone assistant powered by AI has become so commonplace that most people don’t know they are using AI.

Let me show you a very interesting illustration that sums up the AI scene, courtesy of the Artificial Intelligence Blog.

Here are the top five AI-powered personal assistants that you should check out:

Amazon Echo

Amazon is one of the fastest-evolving companies of our time: from selling books online, to Kindles, to the largest online marketplace that has ever existed. They aim to carry that ambition forward with Echo. The AI behind Echo, called Alexa, has been cleverly designed to integrate with the company’s other products such as phones, tablets, cars and fridges. The best thing about Echo is that it is open to third-party developers and can be enhanced by anyone.

When it comes to NLP (natural language processing) – the thing you use when you speak to your personal assistant and give it orders – Echo falls short of the rest. However, the company realizes this and is determined to fix the complaint.

Google Assistant

The advantage with Google is that it is everywhere. They have various popular services and unsurprisingly Google Assistant seamlessly integrates with them all. When you own the largest mobile operating system, you are sure to make it count.

Google has made it clear that it wants serious advancements in AI and personal assistants. You can expect Google’s technology to advance in leaps and bounds in the coming years.

Facebook

Here is a surprising entry. You may not have thought that Facebook’s AI assistant would make this list – or that a Facebook AI assistant even exists. It doesn’t commercially exist at the moment, but various inside sources suggest that Facebook is finalizing its AI component and that it will be out shortly.

Why did I include Facebook in this list when they do not have a personal assistant? There is a simple reason: Facebook is going to be a formidable player in the AI world. The amount of data they have is humongous compared to other players, even Google. Facebook is such a big part of our lives now that it only makes sense its personal-assistant AI would play a gigantic role compared to the other assistants.

Siri

I mentioned at the beginning of the article that Siri is the pioneer of phone-assistant AI. But it became disappointing after a while. It is getting better, but it often feels clumsy, and this has given an edge to Alexa and Google Assistant.

Probably one of the glaring problems is Apple’s strange choice of keeping its users’ data off the cloud. It means that Siri has to go through the data on your device rather than a cloud system.

Samsung

Samsung has not yet made its mark with its AI, called Bixby. However, you can be sure that Samsung will soon be a very big player in the AI circle. Bixby’s language support is only available in Korean right now; it will be available in English soon.

However, right now, Samsung has a lot of catching up to do.

5 surprising things AI can do better than humans (2017-09-22)

Artificial intelligence is growing rapidly. Debates are erupting over whether machines can be better than humans or not; the answer is both yes and no. No, because machines will always be machines; they can never be living, breathing humans. Their consciousness will be mechanical – always.

But is that why we want artificial intelligence? For machines to have human traits like love, talk, laughter, feeling, life and death? No. We want AI to push the mechanical, task-oriented boundaries and make life easy for us.

In that regard, yes AI can do better than humans.

Here are five ways how:

Do quicker web research

When you Google things, do you know what happens? Google has a self-learning machine called RankBrain. It is an AI that handles queries. It works on the meanings of various words and phrases, patterns, and human behavior related to search queries.

When tested with human counterparts, humans showed 70% accuracy in guessing the true meaning of queries while RankBrain had 80% accuracy!

The system is not fully deployed yet; it works in collaboration with Google’s Hummingbird search algorithm. However, RankBrain gives a very good idea of things to come.

Work in impossible environments

Robots don’t need oxygen, and they can survive climatic conditions we humans never could. Take deep space, for example: a robot can collect and send data without worrying about habitability, sleep, food or other human requirements and sentiments. NASA’s rovers are doing exactly that.

Can you imagine a human being landing on Mars or flying through endless space?

Scientists are working on making the space rovers more intelligent and self-sufficient. Recently, researchers at UC Berkeley successfully taught robots to do delicate tasks such as opening bottle caps, removing a nail from wood, and pouring and serving a glass of water.

Speak and translate all languages

Needless to say, a human being cannot learn more than a few languages; in fact, most humans on Earth are monolingual. AI, however, can learn as many languages as there are on Earth. Google Translate is already making gigantic leaps.

Surely all of this is not 100% accurate at the moment – far from it, in fact – but we are headed in the right direction. Many believe it is only a matter of time before human translators become obsolete.

Make error free calculations

Calculators are error-free, and let’s face it, no human can crunch numbers as quickly as a calculator can. Imagine an advanced thinking AI that can solve complicated equations quickly and without error; then the sky is the limit.

Researchers are working on algorithms that allow AI to have a critical thinking process along with a super-fast calculating ability.

Just think how quickly the blueprints of skyscrapers could be completed or flight paths deduced. It is all about calculations, and AI can do them better than us.

Perform better surgeries

This one may come as a surprise, but it is true. Robots don’t get tired. Robots don’t have physical fallibility. A robot equipped with proper AI could perform a thousand operations per day!

Some believe that doctors will become completely obsolete in 50 years or so, and that AI will be able to diagnose and treat medical illnesses with 100% accuracy.

It would be interesting to see that happen.

Request a deal through Ari (2017-09-13)

Over the last few weeks, we’ve had so many updates from Ari that I don’t know where to begin. I am quite puzzled about what to show you this week. Of course, I could put all the updates in one blog post, but that would be too overwhelming – and where is the fun in reading a Wikipedia-like blog post? Unacceptable, because Ari is all about fun.

So let’s talk about a cool new feature Ari has for you.

You can request a car deal through Ari.

Here is how I did it:

I was interested in checking out the new Honda Civic 2017 model, so I went here: https://newcar.carlist.my/honda/2017/civic

In the top right corner I clicked on “Request a Deal” and Ari popped in from the right side of the screen. She asked me to enter my details: phone number, location, email and so on. She sent me a verification code and a verification email.

It was nice to see that she keeps the riff-raff away by taking the pains to verify me; it seemed legitimate.

An instant chat session opened and I could talk to Ari face to face.

After getting myself verified, Ari presented me with a list of things I could do: visit my nearest dealership, book a test drive, compare cars, and even see car loan pricing details. It was wonderful.

She gave me some useful tips for buying a car and then asked if she could suggest a car based on my actions since landing on the page. I politely declined and said, “No, thank you, I have my heart set on the Honda Civic” – I really wanted to get one of those.

Ari argued, and with good reason: “Yes, a good buyer should always compare first before buying a car. But unfortunately I have limited space. I will lead you to the comparison page where you can compare up to four cars. Just click on the image.”

I gave in and looked through various car comparisons on the website. When I returned, Ari asked “where are you?” and listed the cities of Malaysia. I selected my location and that was that.

I was shown the best car deals and, just like that, I bought a beautiful Honda Civic 2017 in a color of my choosing.

As always, let us know your feedback – praise, anger, criticism, whatever you feel like – in the comment section below.

Talk to us, we’re listening.

Ari will now tell you everything – literally (2017-08-07)

Ari learnt something amazing and yet quite simple this week. Tell Ari to “show me all cars” and you will be directed to the Carlist.my page with our entire inventory. Neat, right?

Everything in one place, all you need to see, and what else could you want?

Remember, Ari can give you highly specialized searches and comparisons; she’s already got that covered. Now all we needed was for her to give us more generic information.

I will take the opportunity this week to clear one or two misconceptions about artificial intelligence.

First, let me give you my personal opinion on AI. I believe that AI should be as human as possible. If it were up to me, I wouldn’t create robots at all; I would instead create electronic humans.

We need not be scared of artificial intelligence

The first and foremost thing people ask in a discussion about AI is, “Is it safe? Will it take over the world like Skynet?”

OK, take a step back. AI in real life doesn’t look anything like it does in sci-fi movies, and neither does it function like that. AI is perfectly safe. AI is developed in humanity’s image.

Nothing can replace human beings. According to Mark Zuckerberg, the founder of Facebook, AI is there to improve our lives and make things better, not to take over the world.

And why would the robots EVER need to take over the world?

Artificial intelligence is actually more commonplace than you think

Many of us have already incorporated AI into our lives without even realizing it. Siri, Amazon Alexa and Google Home are all examples of intelligent personal assistants. Once again, I’d reiterate that AI is nothing like what is shown in the movies.

In fact, the actual form of AI is far more useful in our daily lives, but a bit anticlimactic when it comes to showmanship.

Anyway, coming back to Ari.

This week, can you tell us which feature of Ari you use most?

That’s it for this week. We will be back next week with more scintillating Ari updates.

Ari and advanced pricing filters (2017-08-01)

The most engrossing aspect of buying a car (or buying anything for that matter) is pricing. For most of us, price is the first thing we think of when we intend to buy something.

When it comes to buying cars, a smart person would look to get the maximum out of his or her budget. This can be a confusing task. In order to make an informed decision, one must compare all the cars that fall into the desired price bracket.

Image via CBT

But that is virtually impossible, right? Not quite.

Sure, you can use some advanced search tool with multiple search fields and get that list. However, I am not a big fan of traditional search forms. This is the age of artificial intelligence – you should get what you want by typing just one or two words.

Impossible is nothing  

In order to see each and every car that falls within your price range, you just have to tell Ari “cars between 50K and 100K” – that’s it. You don’t even have to display the common courtesy of a “please” or a “hi”; just get straight to the point. Ari doesn’t hold grudges. Ari is professional and has a very big heart.

Get what you want

I gave you an example of a fancy pricing range of 50K – 100K. However, when I was looking to buy a car, my budget was a less flattering 25K. I decided to look for the best option between 20K and 30K, not sure if Ari could give me anything in my modest range.

But voila! I am on the brink of buying a Perodua AXIA 2017 – the best choice I found through Ari's algorithms.

I am glad that Ari is focusing on learning crucial aspects of car dynamics. Pricing is fundamental. I personally think it is a very big leap for an artificial intelligence system that she is teaching herself the all-important things.

She understands car business so well, she knows what matters and what is secondary.

With such fascinating algorithmic thinking, I am beginning to trust that Ari will make the best choice for me. I don’t worry about buying and selling cars anymore; I have everything at my fingertips thanks to Ari – come what may.

That’s it for this week, do let us know what you think of Ari’s progress in the comment section below.

We will be back with more Ari updates next week.

Ari update: now you can compare different cars side by side (2017-07-27)

This week our brilliant Ari got even smarter. The ever self-learning wizard surprised us with yet another landmark: she can now give you comparisons between different cars in a heartbeat. All you have to do is say the word.

She is getting more flexible too: when looking for car details, I didn’t have to enter precise car names. “Compare A4 and A6, please” – and I got a link to a comparison page detailing the two Audi models.

Gone are the days when one had to go through the painful process of filling in a lengthy form in order to get a quick (oxymoronic) comparison.

Hang on, we aren't done yet…

But hold on, Ari is not done giving you the comparison. When she gives you the comparison link she also asks “Want me to add another car to the comparison page?”

Again, you only have to enter the car’s name and you will get another link. Now you can compare three cars side by side in great detail, with virtually every possible piece of information about them.

In mere seconds you can get all the comparisons you need.  

From expert ratings and engine specifications to fuel consumption and pricing, it's all there.

But why compare different cars?

Whenever I am buying a car, I like to browse through options. My advice to you is: always look through different choices. You should never buy a car without proper research. More often than not you will find that a particular car that looks great in pictures is lacking specifications that you’d like to have.

In today’s technology dominated world, it makes no sense to visit a car retailer to look at different models. You can do it online. Even online you don’t have to fill out search forms. These are great times to be alive.

How Ari serves you

The point of artificial intelligence is to give you the information you seek in a more personal way. That is step one. Step two is to give you information that you need but haven’t asked for.

There are many things circling your mind that you cannot pin down when doing research. Ari provides you with a set of options; she thinks like a human, she thinks like you.

She will give you recommendations based on your needs. She will identify what else you might be interested in, based on your searching mindset.

Developing AI has one goal only: to eliminate machine-like output and give you personalized recommendations based on the information you provided, along with intelligent guesses.

That’s all for this week, have a great time Ari-ing your way to car searching like never before.

If you have any ideas, suggestions, recommendations or criticism for Ari, do let us know in the comment section below. We’d love to hear from you. It has always been about you and only you.

See you next week with more Ari updates. Adios.

How to automate the Facebook, Google and Fabric SDK setup for different environments in iOS apps (2017-07-25)

When building mobile apps, it is very common to integrate the Facebook, Google or Fabric SDKs. Managing configurations for these SDKs across different environments can become tedious and error prone. Configurations like setting the app key in Info.plist (or GoogleService-Info.plist) sit outside the scope of the code and cannot be handled programmatically. As a consequence, we often end up managing the files manually and … making mistakes.

At iCarAsia, we use four environments: stack, staging, preprod and production. Every time we build for one of these environments, we need to change the app keys and URL schemes for Fabric, Facebook and Google. In order to limit the chance of error, we decided to automate the process.

According to each SDK's documentation, the following settings need to be modified for every build:

Facebook:

  • Setting the FacebookAppID in the Info.plist
  • Setting the FacebookDisplayName in the Info.plist
  • Setting the FacebookAppID, with the fb prefix, as the URL scheme in the Info.plist

(Screenshots: Facebook_info, Facebook_URL_scheme)
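For reference, here is roughly what these Facebook entries look like in the Info.plist source. The app ID and display name below are placeholders, not real keys:

```xml
<key>FacebookAppID</key>
<string>1234567890</string>
<key>FacebookDisplayName</key>
<string>Carlist - Production</string>
<key>CFBundleURLTypes</key>
<array>
    <dict>
        <key>CFBundleURLSchemes</key>
        <array>
            <!-- same app ID, prefixed with "fb" -->
            <string>fb1234567890</string>
        </array>
    </dict>
</array>
```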


Google:

Google has its own GoogleService-Info.plist; its TRACKING_ID needs to be replaced in that file.

(Screenshot: Google_plist)


Fabric:

Setting the APIKey in the Info.plist under the Fabric key.

(Screenshot: Fabric)
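As a sketch of what the automated equivalent of these three manual edits can look like, macOS ships a `/usr/libexec/PlistBuddy` tool that can rewrite plist entries in the built product. All key values below are placeholders; `TARGET_BUILD_DIR` and `INFOPLIST_PATH` are variables Xcode sets when the script runs in a build phase:

```shell
#!/bin/sh
# Sketch only: PlistBuddy commands corresponding to the manual edits above.
# All key values are placeholders, not real credentials.
PLISTBUDDY="/usr/libexec/PlistBuddy"
INFO_PLIST="${TARGET_BUILD_DIR:-.}/${INFOPLIST_PATH:-Info.plist}"
GOOGLE_PLIST="${TARGET_BUILD_DIR:-.}/GoogleService-Info.plist"

if [ -x "$PLISTBUDDY" ]; then
    # Facebook: app id, display name, and the fb-prefixed URL scheme
    "$PLISTBUDDY" -c "Set :FacebookAppID 1234567890" "$INFO_PLIST"
    "$PLISTBUDDY" -c "Set :FacebookDisplayName Carlist - Staging" "$INFO_PLIST"
    "$PLISTBUDDY" -c "Set :CFBundleURLTypes:0:CFBundleURLSchemes:0 fb1234567890" "$INFO_PLIST"
    # Google: TRACKING_ID lives in GoogleService-Info.plist
    "$PLISTBUDDY" -c "Set :TRACKING_ID UA-0000000-1" "$GOOGLE_PLIST"
    # Fabric: APIKey sits in a dictionary under the Fabric key of Info.plist
    "$PLISTBUDDY" -c "Set :Fabric:APIKey 0123456789abcdef" "$INFO_PLIST"
else
    # PlistBuddy ships with macOS; elsewhere this sketch is a no-op.
    echo "PlistBuddy not available; nothing changed"
fi
```

Editing the copy inside the build products, rather than the source files, keeps the repository clean and makes the change per-build.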


Automation:

To automate this process, we use shell scripts, which can easily be integrated into the Xcode build process via Target -> Build Phases -> New Run Script Phase.

(Screenshot: adding a Run Script phase)


To invoke a script, just place a call to that script file in the shell script placeholder. For example, for Facebook it is the following snippet in our project:

# Setup Facebook Kit API Keys
. ${PROJECT_DIR}/FacebookKeyAutomationScripts/Setup_Facebook_API_Keys.sh

which in the Xcode project looks like this:

(Screenshot: Facebook script)

 

Note: just make sure this script phase comes after the “Compile Sources” and “Copy Bundle Resources” phases, so that the files which need to be modified have already been copied to the target bundle.


(Screenshot: Facebook script phase order)


Now let's turn to the actual script that modifies the files. Since the concept is the same for every SDK, let's use Facebook for illustration purposes.

Scripts:

To start off, we always create two shell script files:


Facebook script files


  1. Keys file : Defining the keys for different environments
  2. Automation logic file : The script to change the keys in the build depending on the build environment.

1.    Keys file

This file contains all the keys for the different environments. For us, these are staging, stack, preprod and production.

#!/bin/sh
# Facebook_keys.sh
# iOSConsumerApp
#
# Created by Muhammad Tanveer on 9/17/15.
# Copyright (c) 2015 iCarAsia. All rights reserved.

if [ "${TARGET_NAME}" = "iOSConsumerApp" ]; then
    the_facebook_production_api_key="XXXXXXXXX"
    the_facebook_production_display_name="Carlist - Production"
    the_facebook_production_url_scheme="fbXXXXXXXXX"
    the_facebook_preprod_api_key="XXXXXXXXX"
    the_facebook_preprod_display_name="Carlist - Preprod"
    the_facebook_preprod_url_scheme="fbXXXXXXXXX"
    the_facebook_staging_api_key="XXXXXXXXX"
    the_facebook_staging_display_name="Carlist - staging"
    the_facebook_staging_url_scheme="fbXXXXXXXXX"
    the_facebook_stack_api_key="XXXXXXXXX"
    the_facebook_stack_display_name="Carlist - Stack"
    the_facebook_stack_url_scheme="fbXXXXXXXXX"
# Indonesia app keys
elif [ "${TARGET_NAME}" = "iOSConsumerApp-ID" ]; then
    the_facebook_production_api_key="XXXXXXXXX"
    the_facebook_production_display_name="Mobil123 - Production"
    the_facebook_production_url_scheme="fbXXXXXXXXX"
    the_facebook_preprod_api_key="XXXXXXXXX"
    the_facebook_preprod_display_name="Mobil123 - Preprod"
    the_facebook_preprod_url_scheme="fbXXXXXXXXX"
    the_facebook_staging_api_key="XXXXXXXXX"
    the_facebook_staging_display_name="Mobil123 - Staging"
    the_facebook_staging_url_scheme="fbXXXXXXXXX"
    the_facebook_stack_api_key="XXXXXXXXX"
    the_facebook_stack_display_name="Mobil123 - Stack"
    the_facebook_stack_url_scheme="XXXXXXXXX"
fi

Another benefit of using this file is that if you have multiple targets, you can differentiate the keys per target. We have three targets for three countries, and we use this to pick the keys that will be used in the second script.
${TARGET_NAME} is one of the variables available in the build settings which can be used to differentiate the targets in an Xcode project. For the complete list of available build settings, please refer to this Apple page.

 

2.   Automation logic file

The second phase is to use the keys and perform the required changes according to the SDK requirements. Here is the script to do that

#!/bin/sh
# Setup_Facebook_API_Keys.sh
# iOSConsumerApp
#
# Created by Muhammad Tanveer on 9/17/15.
# Copyright (c) 2015 iCarAsia. All rights reserved.

path_to_info_plist_file="$TARGET_BUILD_DIR/${PRODUCT_NAME}.app/Info.plist"

# Import keys and secrets from a file
. ${PROJECT_DIR}/FacebookKeyAutomationScripts/Facebook_keys.sh

if [ "${CONFIGURATION}" = "AppStore" ]; then
    echo "Release Build Configuration - Set Facebook API Key to Production"
    the_current_facebook_api_key=`/usr/libexec/PlistBuddy -c "Print :FacebookAppID" "$path_to_info_plist_file"`
    echo "Current Facebook API Key from Info.plist: $the_current_facebook_api_key"
    echo "Facebook Production API Key: $the_facebook_production_api_key"
    if [ "$the_current_facebook_api_key" = "$the_facebook_production_api_key" ]; then
        # Keys match - do not change
        echo "Facebook API Keys match. Will not update"
    else
        # Keys do not match - will change
        echo "Current Facebook API Key is not the same as new API Key, will change"
        /usr/libexec/PlistBuddy -x -c "Set :FacebookAppID $the_facebook_production_api_key" "$path_to_info_plist_file"
        /usr/libexec/PlistBuddy -x -c "Set :FacebookDisplayName $the_facebook_production_display_name" "$path_to_info_plist_file"
        # Assuming the URL scheme for Facebook is the first one in the URL schemes array
        /usr/libexec/PlistBuddy -x -c "Set :CFBundleURLTypes:0:CFBundleURLSchemes:0 $the_facebook_production_url_scheme" "$path_to_info_plist_file"
        the_updated_facebook_api_key=`/usr/libexec/PlistBuddy -c "Print :FacebookAppID" "$path_to_info_plist_file"`
        the_updated_facebook_display_name=`/usr/libexec/PlistBuddy -c "Print :FacebookDisplayName" "$path_to_info_plist_file"`
        the_updated_facebook_url_scheme=`/usr/libexec/PlistBuddy -c "Print :CFBundleURLTypes:0:CFBundleURLSchemes:0" "$path_to_info_plist_file"`
        echo "Facebook API Key set to: $the_updated_facebook_api_key"
        echo "Facebook Display Name set to: $the_updated_facebook_display_name"
        echo "Facebook URL Scheme set to: $the_updated_facebook_url_scheme"
    fi
elif [ "${CONFIGURATION}" = "Release" ]; then
    echo "AdHoc Build Configuration - Set Facebook API Key to Preprod"
    the_current_facebook_api_key=`/usr/libexec/PlistBuddy -c "Print :FacebookAppID" "$path_to_info_plist_file"`
    echo "Current Facebook API Key from Info.plist: $the_current_facebook_api_key"
    echo "Facebook Preprod API Key: $the_facebook_preprod_api_key"
    if [ "$the_current_facebook_api_key" = "$the_facebook_preprod_api_key" ]; then
        # Keys match - do not change
        echo "Facebook API Keys match. Will not update"
    else
        # Keys do not match - will change
        echo "Current Facebook API Key is not the same as new API Key, will change"
        /usr/libexec/PlistBuddy -x -c "Set :FacebookAppID $the_facebook_preprod_api_key" "$path_to_info_plist_file"
        /usr/libexec/PlistBuddy -x -c "Set :FacebookDisplayName $the_facebook_preprod_display_name" "$path_to_info_plist_file"
        /usr/libexec/PlistBuddy -x -c "Set :CFBundleURLTypes:0:CFBundleURLSchemes:0 $the_facebook_preprod_url_scheme" "$path_to_info_plist_file"
        the_updated_facebook_api_key=`/usr/libexec/PlistBuddy -c "Print :FacebookAppID" "$path_to_info_plist_file"`
        the_updated_facebook_display_name=`/usr/libexec/PlistBuddy -c "Print :FacebookDisplayName" "$path_to_info_plist_file"`
        the_updated_facebook_url_scheme=`/usr/libexec/PlistBuddy -c "Print :CFBundleURLTypes:0:CFBundleURLSchemes:0" "$path_to_info_plist_file"`
        echo "Facebook API Key set to: $the_updated_facebook_api_key"
        echo "Facebook Display Name set to: $the_updated_facebook_display_name"
        echo "Facebook URL Scheme set to: $the_updated_facebook_url_scheme"
    fi
elif [ "${CONFIGURATION}" = "Debug" ]; then
    echo "Debug Build Configuration - Set Facebook API Key to Development"
    the_current_facebook_api_key=`/usr/libexec/PlistBuddy -c "Print :FacebookAppID" "$path_to_info_plist_file"`
    echo "Current Facebook API Key from Info.plist: $the_current_facebook_api_key"
    echo "Facebook Development API Key: $the_facebook_staging_api_key"
    if [ "$the_current_facebook_api_key" = "$the_facebook_staging_api_key" ]; then
        # Keys match - do not change
        echo "Facebook API Keys match. Will not update"
    else
        # Keys do not match - will change
        echo "Current Facebook API Key is not the same as new API Key, will change"
        /usr/libexec/PlistBuddy -x -c "Set :FacebookAppID $the_facebook_staging_api_key" "$path_to_info_plist_file"
        /usr/libexec/PlistBuddy -x -c "Set :FacebookDisplayName $the_facebook_staging_display_name" "$path_to_info_plist_file"
        /usr/libexec/PlistBuddy -x -c "Set :CFBundleURLTypes:0:CFBundleURLSchemes:0 $the_facebook_staging_url_scheme" "$path_to_info_plist_file"
        the_updated_facebook_api_key=`/usr/libexec/PlistBuddy -c "Print :FacebookAppID" "$path_to_info_plist_file"`
        the_updated_facebook_display_name=`/usr/libexec/PlistBuddy -c "Print :FacebookDisplayName" "$path_to_info_plist_file"`
        the_updated_facebook_url_scheme=`/usr/libexec/PlistBuddy -c "Print :CFBundleURLTypes:0:CFBundleURLSchemes:0" "$path_to_info_plist_file"`
        echo "Facebook API Key set to: $the_updated_facebook_api_key"
        echo "Facebook Display Name set to: $the_updated_facebook_display_name"
        echo "Facebook URL Scheme set to: $the_updated_facebook_url_scheme"
    fi
fi

First we define the path to the Info.plist of the target. Then we import the keys file defined in step 1 into this scope so that we can use it.

Next we check whether the current configuration for the project is “AppStore”. If it is indeed the AppStore build, we get the current key using PlistBuddy, a helpful utility available on macOS for reading and modifying plist files.

We then compare the current key with the Facebook production key. If the keys match, the previous build used the same configuration and the keys are already set correctly, so there is no need to change anything. Otherwise, we update the FacebookAppID, FacebookDisplayName and URL scheme values.
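The compare-then-set step above can be sketched outside Xcode with a plain file standing in for Info.plist. This is a minimal, hypothetical illustration: the file path and key values below are made up, and PlistBuddy is replaced by simple reads and writes so the snippet runs anywhere.

```shell
#!/bin/sh
# Hypothetical sketch of the idempotent update: only write when the
# current value differs from the target value.
STORE="/tmp/demo_facebook_key"            # stands in for Info.plist
the_facebook_production_api_key="PRODKEY" # made-up key

echo "OLDKEY" > "$STORE"
the_current_facebook_api_key=$(cat "$STORE")

if [ "$the_current_facebook_api_key" = "$the_facebook_production_api_key" ]; then
  echo "Facebook API Keys match. Will not update"
else
  echo "Current key differs, updating"
  echo "$the_facebook_production_api_key" > "$STORE"
fi
echo "Facebook API Key set to: $(cat "$STORE")"
```

The second run of such a script is a no-op, which is exactly why switching between configurations repeatedly stays cheap.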

Note:

An important thing to mention: in this case the Facebook URL scheme is the first element in the URL schemes array. If your URL scheme is at the second or any other position, use that position (i.e. “:positionIndex:”) instead of “:0:” in the script.

After that, to confirm that the values were changed correctly, we read the latest values from the plist and print them to the console.

You can confirm the values in the Report Navigator (Cmd + 8) in Xcode.


Script Output

 

The remaining branches of the script repeat the same logic for the other configurations, just with different keys.

The same kind of scripts can be written for Fabric and Google. For your reference, links to these scripts for our app are added at the bottom.

There is one extra step for Fabric: uploading the dSYM to Fabric, which can be added as another run script build phase. The script for that is:

#!/bin/sh
# Run_Crashlytics.sh
# iOSConsumerApp
#
# Created by Muhammad Tanveer on 9/17/15.
# Copyright (c) 2015 iCarAsia. All rights reserved.

# Import keys and secrets from a file
. ${PROJECT_DIR}/Crashlytics_build_phase_run_scripts/Crashlytics_keys.sh

if [ "${CONFIGURATION}" = "AppStore" ]; then
    echo "Running Crashlytics for this build"
    echo "Will upload to production Organization"
    "${PODS_ROOT}/Fabric/Fabric.framework/run" $the_crashlytics_production_api_key $the_crashlytics_production_build_secret
elif [ "${CONFIGURATION}" = "Release" ]; then
    echo "Running Crashlytics for this build"
    echo "Will upload to development Organization"
    "${PODS_ROOT}/Fabric/Fabric.framework/run" $the_crashlytics_development_api_key $the_crashlytics_development_build_secret
else
    echo "Not Running Crashlytics for this build"
fi

It just invokes the Fabric run command as described in their docs. We don’t run it for the development build, as we don’t want to upload the dSYM file for crashes during development. But for QA and production builds it gets uploaded automatically.

The gists of other scripts can be found at the following links.

Facebook_keys.sh , Setup_Facebook_API_Keys.sh

GoogleAnalytics_keys.sh , Setup_GoogleAnalytics_API_Keys.sh

Crashlytics_keys.sh , Setup_Crashlytics_API_Keys.sh

Conclusion:

Automating the key change process for different environments and countries had a great effect on our workflow. We can now work with confidence, knowing the right keys will be used for each environment. It also saved us significant time on the tedious and error prone task of configuring SDKs.

We will now work on automating the build distribution process for QA and Apple, so that, for instance, when QA needs a specific build for a specific country, the correct app is built and sent with one command.

]]>
tag:engineering.icarasia.com,2013:Post/1176843 2017-07-25T09:59:44Z 2017-07-25T10:15:24Z Elasticsearch – Part 1 – Why we chose it?

Understanding our searches and listings.

At the end of last year, we started working on ways to understand our visitors. One of the things we were interested in was what they searched for and how often they found what they were looking for. On the other side, we also wanted to know what happens to the listings our sellers post: how often they show up, and in which search queries. Then there were questions like what the most popular car makes and models are, in which regions, and other patterns we might find. In the end we wanted to correlate all of this to understand what is happening on our sites, from the individual listings to the bigger picture. Eventually this will help us improve our systems, and later, integrating this data back into the sites may provide an improved experience for both buyers and sellers.

We started exploring ways we could store this data. Getting the data was the easy part, as what the user searches for comes to our internal API. The challenge was storing the data in a manner that could later be used not only to understand it, but to integrate it back into our system.

There were two important questions we asked ourselves. One was where to store the data and the second one was how to make it meaningful?

 

So how do we store this data?

As we already were using Apache Solr as the search engine for our sites, our first thought was to somehow enable logs in Solr and get those logs into a format, which we could analyze. 

elasticsearch1

On our search for something that did this, we came upon the ELK (Elasticsearch, Logstash, Kibana) stack, which almost sounded like what we wanted.

Logstash would take the Solr logs and dump them into Elasticsearch, Elasticsearch would allow us to query them, and Kibana would use Elasticsearch to graph them. Elasticsearch is a search server based on Lucene which provides a distributed, multitenant-capable full-text search engine with an HTTP web interface and schema-free JSON documents. It is written in Java and is open source under the Apache License.

We did a trial run of ELK using Solr logs. It worked, but then came the inflexibility. The logs contained only what the user searched for, not where the query was generated or other data we might need. The other problem was how to capture what the visitor saw as a result of the search: that would require post-processing, as we would have to pick up the search query and fill in the results at a later time.

We put ELK aside for now and started looking for alternatives, ruling out RDBMS and exploring some of the NoSQL databases and other post-processing technologies. We went through MongoDB, Hadoop, Hive, Cassandra and VoltDB.

es2

 

One of the solutions we worked out involved Cassandra. Cassandra's benchmarks were the best, with a high number of writes in less time, and it compresses data on storage, requiring less disk space, so it seemed almost exactly what we needed.

We first created a basic schema for Cassandra, creating collections, and wrote an API endpoint to write some dummy data into it. Then we ran basic load tests using JMeter while storing the data. The writes were great, and the disk space taken by Cassandra was low. But while implementing this, we felt we were constantly reworking the schema and rethinking what we wanted to store, then changing the implementation. One thing that bothered us was the post-processing we would have to do if we chose Cassandra. As the data would be in raw format, we would have to work on and change the schema repeatedly to produce usable data, and then do post-processing on top of it. Since we were only beginning to explore how the data could be used, we needed something that would require less processing and would bring our data into a format we could query for aggregations and analytics.

Elasticsearch

es3

We then went back to the ELK stack and decided we didn’t need Logstash. We started a similar testing process, but using standalone Elasticsearch for now. From our experience with Cassandra, we knew the first thing to do was to finalize the fields from the search query we wanted to save, and what data from the search result we wanted to record for each individual search query. All of this is anonymous data, but having it decided early on was a plus.

We did a rough calculation using the numbers from New Relic and Google Analytics and came up with a rough number of requests we were anticipating. We then wrote scripts to populate dummy data into Elasticsearch to see the space it takes to store documents (each document containing an average number of search parameters and one search result). We now had an estimated data size, but what about the load? We started by sending concurrent write requests to the server, initially using JMeter to create the concurrent requests, with limited success. Unlike Cassandra, which was on our local server, Elasticsearch was deployed on an AWS machine, so we ran into a bandwidth bottleneck while testing. We decided to run the benchmark from another AWS machine on the same network as the Elasticsearch machine. During this time we moved away from JMeter and started using Apache Benchmark for the concurrent tests. And this is when we decided to go with Elasticsearch: it easily handled the number of writes we estimated, and the data was easy to query. The only concern was disk size. Our initial assessment showed 2 TB of data for 3 months (assuming one search query with one search result), and the real figure would be much higher, as a search normally returns about 10 results.
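As a rough sketch of that benchmark setup: a dummy search-log document is POSTed to Elasticsearch under concurrency with Apache Benchmark. The index name, document fields and host below are made up for illustration, and since a live cluster is needed, the ab command is only printed here rather than executed.

```shell
#!/bin/sh
# Hypothetical benchmark sketch: write a dummy search-log document, then
# show the ab invocation that would hammer the index with it.
cat > /tmp/search_doc.json <<'EOF'
{"query":"honda civic","region":"kuala-lumpur","results_count":10}
EOF

# 10000 requests, 50 concurrent, POSTing the JSON body each time
# (printed only -- needs a reachable Elasticsearch host):
echo "ab -n 10000 -c 50 -p /tmp/search_doc.json -T application/json http://es-host:9200/searches/search/"
```

Varying `-c` while watching indexing latency is what told us whether the cluster could absorb our estimated write rate.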

es4

 

Then we came to the question of how to get the data in the Elasticsearch. Of course the API directly writing to Elasticsearch was the easiest solution. But there were three concerns in this.

First, we didn’t want our API endpoints doing extra work and slowing down. Secondly, in case of a delay in the Elasticsearch write, we didn’t want our API endpoint to slow down. Thirdly, in case of an Elasticsearch error, we didn’t want it affecting the original endpoint, and we also wanted a retry mechanism.

To avoid all this, we decided to let our queue server (Fresque) do the writing to Elasticsearch. Our API would just create a job with the search query and forget about it, without having to do anything else. It is the job’s task to generate the Solr results, process the search query parameters, do any post-processing work, and then save everything into Elasticsearch. This ensures the site functions as is, with the load shifted to the queue server. I’ll discuss Fresque and other load-testing details in Part 2 of this article.

Why Elasticsearch and not Solr?

There is a question that was nagging in our minds: why didn’t we just go ahead and use Solr? It is also based on Apache Lucene, has great search features, and we already have experience managing it. The answer lies in what we needed in this case. We chose Elasticsearch because of how it indexes data, the analyzers we can use, and the ability to model nested and parent-child data, but mainly because of the analytical queries it can do.

Solr is still geared more towards text search, while Elasticsearch tilts more towards filtering and grouping: the analytical query workload, not just text search. The Elasticsearch team has made efforts to make such queries more efficient (lower memory footprint and CPU usage) in both Lucene and Elasticsearch. Elasticsearch is a better choice for us, as we don’t need just text search but also complex search-time aggregations.
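To make "search-time aggregations" concrete, this is the shape of query we had in mind: most-searched car makes per region, in one request. The index name, field names and host are hypothetical, and the request is only printed, since it needs a running cluster.

```shell
#!/bin/sh
# Hypothetical aggregation sketch: bucket searches by region, then by the
# car make searched within each region.
cat > /tmp/agg_query.json <<'EOF'
{
  "size": 0,
  "aggs": {
    "by_region": {
      "terms": { "field": "region" },
      "aggs": {
        "top_makes": { "terms": { "field": "car_make" } }
      }
    }
  }
}
EOF

# Printed only -- needs a live Elasticsearch host:
echo "curl -s 'http://es-host:9200/searches/_search' -d @/tmp/agg_query.json"
```

One request like this answers a question that would otherwise need a post-processing job over raw rows, which is exactly the effort we were trying to avoid.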

The way Elasticsearch manages shards and replication is also better than Solr, as it is a native feature with more control, but we didn’t put that into consideration, although it is a good reason in itself.

Conclusion

So in the end, we went with Elasticsearch, compromising on the high data size, for its ability to aggregate data, make the data searchable, and let us perform analytical queries, reducing the effort needed to process the data. Elasticsearch can transform data into searchable tokens with the tokenizer of our choice, perform any transformation on it, and then index the needed fields. It also supports both nested objects and parent-child objects, which is a great way to make sense of complex data. Then there is the wonderful Kibana, which can plot graphs from Elasticsearch and give us instant meaning.

 

Next up

Elasticsearch – Part 2 – Implementation and what we learned.
Elasticsearch – Part 3 – A few weeks fast forwarded and the way ahead.

]]>
tag:engineering.icarasia.com,2013:Post/1176840 2017-07-25T09:53:57Z 2017-07-25T09:54:19Z iCar Asia Product And Technology Hackathon Day

Winter is coming… We are going to have a Hackathon day… For some reason, both of these sentences meant the same to the fun peeps of the iCar Asia Product and IT team. Maybe because this idea was over-discussed and never actually happened (for the year 2015), just like the brothers of the Night’s Watch were told too much about the White Walkers. It was just a myth for the brothers until they actually saw the White Walkers raiding Hardhome as the Free Folk boarded ships bound for Castle Black [Game of Thrones, Season 5]. So we, the Product & Technology culture committee members – Faraz, Divya, Yi Fen, Syam, and Salam (myself) – made sure the equivalent of the White Walkers' raid happened at iCar so that it couldn’t stay a myth anymore. Yes, I’m talking about arranging the Hackathon day for our team.

After the ‘how and when are we going to arrange it’ discussion in the culture committee meeting, this is the email I sent out to the team on 15th April 2015.

Hi Team,

As you all know, we have been discussing about the ‘Hackathon‘  for quite sometime now, let’s actually do it.

The culture committee, as a team, has agreed on making it happen next ‘Tuesday 21st April 2015’ and Joey – our beloved CIO has approved it too. So get your turbo-creativity charged: you’re gonna need it.

There are few rules which we are gonna share with you later this week. The basic idea is to start the Hackathon officially on Monday evening 5 PM: you can form a team, think about the idea, and start working on it on Monday itself (after 5 PM). You can work on your ‘great idea’ until Tuesday 5 PM.

After 5 PM Tuesday, each team (turn by turn) will present whatever they have worked on and then ideas will be ranked based on a preset criteria (Which we’ll share later with you).

Don’t forget, there are prizes too (for the first and the second best teams).

Get ready folks: Winter is coming ;).

Thanks.
Salam Khan

There was mixed feedback about the email. Many thought it was just another promise email and nothing was going to happen; however, the push from the culture committee made them feel that it was real and not just another promise.

Hackathon Guidelines / Rules

To emphasize that this hackathon was real, the very next day I wrote these guidelines, discussed them with the culture committee, and shared them with everybody in the Product and Technology team. Some points were taken as a joke, but once explained, the team agreed to follow them.

Read to enjoy :).

Team Guidelines

  1. Each team must consist of more than 2 members but not more than 5 (follow the Hipster, Hacker, and Hustler approach)
  2. Syam and David cannot be in the same team
  3. Manju and Faraz cannot be in the same team
  4. Arvind and Tanveer cannot be in the same team
  5. Joey and Pedro cannot be a part of any team
  6. Any team cannot consist of more than 2 .NET devs
  7. Any team cannot consist of more than 2 PHP devs
  8. Any team cannot consist of more than 2 QA
  9. Syaiful and Juliana cannot be in the same team
  10. Alain, Geetha, and Celine cannot be in the same team
  11. Sonny and Jackson cannot be in the same team
  12. Albert and Salam cannot be in the same team
  13. Team can be formed anytime now until Thursday 5 PM but the actual work must not start before that
  14. Teams will have 24 hours – From 5 PM Thursday 23rd April 2015 To 5 PM Friday 24th April 2015 to work on their idea
  15. Teams can spend 24 hours in the office if they want to
  16. The output of a team can or cannot be a working software. It can be a prototype, a software, or even a presentation
  17. There must not be any single P&T members left without being a part of the team

Jury and the general rules

  1. Joey and Pedro (and the overall clapping for each team)
  2. Ideas will be rewarded on the base of:
    1. Innovation and creativity
    2. Impact on society
    3. Market viability
  3. Each team will get 5 to 7 minutes (not less than 5 minutes and not more than 7 minutes)
  4. No drug or creativity-enhancing stuff (other than Redbull and Coffee) can be used throughout the Hackathon

Prizes

  1. First team gets Raspberry pi 2 Model B (for each member)
  2. Second team gets iFlix Annual Subscriptions (for each member) for 2015
  3. All teams will get a certificate of Hackathon participation (for each member)

First draft by Salam, approved by the Culture Committee, and Joey.

We told everybody they only had one day to form their teams and that the day after tomorrow (April 22nd) was the Hackathon day. And this time we asked the culture committee members to make sad or angry faces. We did that. And it actually worked.

Hackathon Teams

Within next day we had these 4 teams formed.

ATAMS (Pronounced as ATOMS)

  1. Alain
  2. Tanveer
  3. Ashok
  4. Mayur
  5. Salam

HEAVYWEIGHT (Yeah, most of them are heavyweight indeed)

  1. Syaiful
  2. Bob
  3. Manju
  4. Syamsul
  5. David

The Winning Team (It doesn’t mean they won or something ;))

  1. Fahad
  2. Zeeshan
  3. Wei Fong
  4. Juliana
  5. Celine
  6. Jackson

Juz Bananas (Yeah, whatever, they won!)

  1. Faraz
  2. Shahzad
  3. Daniel
  4. Yi Fen
  5. Arvind
  6. Lakshami

Team Projects.

It really happened. Every group worked very hard and used their creativity to build something new. The team projects were as follows.

Carlist Desktop Chat project

Carlist-Chat-Project

(Langkawi) Travel Mobile App

iCar-Hackathon-Plan-My-Trip

CanCan Lunch App

CanCan-App

Carlist Desktop – One Stop Shop for buyers and sellers

One-Stop-Shop-Carlist

And the first and the second prizes went to…

All four ideas were really appreciated by the jury and the audience (business people from other departments). But in the end, the number one idea was the ‘Chat Project for Carlist‘, which won the jury’s hearts, followed by the ‘Chat App‘ winning the second prize.

Random clicks

Here you go, some random clicks from the Hack Day.

iCar-Hackathon-Day-1 2 Small
iCar-Hackathon-Day-2 Small
iCar-Hackathon-Day-3 Small
iCar-Hackathon-Day-4 Small
iCar-Hackathon-Day-5 Small
iCar-Hackathon-Day-6 Small
iCar-Hackathon-Day-7 Small

In the end, I would like to use this platform to thank everybody (current and ex-colleagues alike) at the iCar Asia Product team who helped us arrange this amazing hack day. As we all believe that “a journey of a thousand miles begins with a single step”, and the first step is always the toughest one, I hope that more of these Hackathons / Hack Days will keep happening at iCar Asia and the fun peeps in the Product and Technology team will keep innovating.

Cheers.

]]>
tag:engineering.icarasia.com,2013:Post/1176831 2017-07-25T09:35:07Z 2017-07-25T09:40:22Z Migrate Old URLs to New URL Structure Using Nginx and Redis.

While maintaining a website, webmasters may decide to move the whole website or parts of it to a new location. For example, you might have a URL structure which is not SEO or user friendly and you want to change it. Changing the URL structure can involve a bit of effort, but it’s worth doing properly.

It’s very important to redirect all of your old URLs' traffic to the new locations using 301 redirects, and to make sure the site can be navigated without running into 404 error pages.

To start with, you will need to generate a list of old URLs and map them to their new destinations. This list can grow bigger and bigger depending on the size of your website. How you store this mapping also depends on your servers and the number of URLs. You can use a database, or configure URL rewriting on your server or application for common redirect patterns.

The problem with a database is that it is slow, while file-based mapping (with nginx) can take a long time just to reload or restart nginx (which is needed whenever you add more redirect rules) and also takes a significant amount of memory, depending on the size of the mapping file.

Nginx + Redis – Migrate Old URLs to New URL Structure

Fortunately, by using Redis and the Nginx Lua module you can make this transition smooth and the overall migration process painless.

Requirements:

1 – Install the nginx-extras & redis-server packages (http://www.dotdeb.org/instructions/)
2 – Install http://openresty.org/download/ngx_openresty-1.2.4.14.tar.gz
3 – Configure nginx
+ Add the following line at the start of the nginx configuration file (replace the path with the location where you installed the OpenResty module).


lua_package_path "/usr/local/openresty/lualib/?.lua;;";

4 – Add the following location block to the nginx configuration file:


location ~ "^/[\d]{4}/[\d]{2}/[\d]{2}/(?<slug>[\w-]+)/?$" {
    content_by_lua '
        local redis = require "resty.redis"
        local red = redis:new()

        red:set_timeout(1000) -- 1 sec
        local ok, err = red:connect("127.0.0.1", 6379)
        if not ok then
            ngx.exit(503)
            return
        end

        local key = ngx.var.slug
        local res, err = red:get(key)

        if not res then
            ngx.exit(404)
            return
        end

        if res == ngx.null then
            ngx.exit(404)
            return
        end

        ngx.redirect(res, 301)
    ';
}

 

How does it work?


lua_package_path "/usr/local/openresty/lualib/?.lua;;";

This line tells nginx to load the Lua module, as you intend to use a Lua script in your configuration.


location ~ "^/[\d]{4}/[\d]{2}/[\d]{2}/(?<slug>[\w-]+)/?$"

This line matches all requests with the old URL pattern so they fall under this block, capturing the slug part of the path into a variable (i.e. slug).
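One way to sanity-check the pattern outside nginx is with grep's extended regex. The sample paths are made up; note that `[\w-]` becomes an explicit character class in POSIX ERE.

```shell
#!/bin/sh
# Quick standalone check of the old-URL pattern /{year}/{month}/{day}/slug.
pattern='^/[0-9]{4}/[0-9]{2}/[0-9]{2}/[A-Za-z0-9_-]+/?$'

echo "/2015/04/21/my-first-post/" | grep -Eq "$pattern" && echo "match"
echo "/about-us/"                 | grep -Eq "$pattern" || echo "no match"
```

Prints "match" for the dated-post path and "no match" for the static page, which is what we want: only old post URLs should hit the Redis lookup.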


local redis = require "resty.redis"
local red = redis:new()

red:set_timeout(1000) -- 1 sec
local ok, err = red:connect("127.0.0.1", 6379)
if not ok then
ngx.exit(503)
return
end

The Lua script above will try to connect to the Redis server (on host 127.0.0.1 and port 6379) with a 1 second timeout.


local key = ngx.var.slug
local res, err = red:get(key)

The Lua script above will get the key from Redis (i.e. using the Lua variable slug which we got from the regex).

The rest is quite self-explanatory: it redirects with 301 if the key is found, or returns 404 if not.

+ NOTE: In the example above, replace the regex and the Redis server host & port according to your needs:
a. The regex above is for the URL pattern /{year}/{month}/{day}/slug
b. The Redis server host (i.e. 127.0.0.1) and port (i.e. 6379)
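To feed Redis with the slug-to-URL mapping in the first place, one possible approach is to generate SET commands from a CSV export of the mapping. The file name and URLs below are hypothetical, and the redis-cli commands are printed rather than executed, since they need a running server.

```shell
#!/bin/sh
# Hypothetical sketch: turn a "slug,new-url" CSV into redis-cli SET commands.
cat > /tmp/url_map.csv <<'EOF'
my-first-post,http://example.com/cars/my-first-post
another-post,http://example.com/cars/another-post
EOF

while IFS=, read -r slug url; do
  echo "redis-cli SET \"$slug\" \"$url\""
done < /tmp/url_map.csv
```

Piping the output into `sh` (or into `redis-cli --pipe` in Redis protocol form for large sets) would load the mapping, and new redirect rules become plain SET calls with no nginx reload needed.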

Know another, or perhaps a better, way to migrate old URLs to a new URL structure? Or have you used the same method for your website’s URL migration? Share your experience with us in the comments. We are always happy to hear from you.

]]>