Blog

V.me Digital Checkout

iCODONS now provides technology implementation services for Visa Inc.'s new digital wallet, V.me. The service is currently available in the US and Canada, and by the end of this year Visa will roll it out to selected European and APAC countries.

In terms of its core offering, V.me is not very different from PayPal, though it does offer some advantages in user experience, merchant services and, more importantly, fraud prevention. Adoption, however, is Visa's biggest challenge with V.me.

On the adoption front, Visa is taking a different go-to-market strategy for V.me. V.me adoption is a chicken-and-egg problem, and Visa is tackling it in two ways. First, it is partnering directly with major banks and financial institutions (13 of the top 25 banks in the US already support V.me). Getting banks on board means your next Visa card will come V.me-enabled, which gives adoption a massive head start.

Second, Visa is getting merchants on board by offering incentives to early adopters. From what we have heard, these incentives are substantial. For early adopters, Visa is offering:

  • a very low standard/transaction fee for the first few years
  • initial investment and support for V.me development
  • early access to mobile payments

In terms of merchant adoption, Visa is focusing online first, but it is also working with technology partners such as Samsung and HTC to bring next-generation NFC-enabled mobile payment options. More than 200 US retailers, including online heavyweights Overstock.com, LivingSocial and Newegg, are already on board; some are live and others are in the final phase.

For developer integration of V.me into your e-commerce platform, Visa offers application programming interfaces (APIs) and software development kits (SDKs) via CyberSource and Authorize.Net. So if you are running an e-commerce system that is already integrated with CyberSource or Authorize.Net, chances are you will be up and running in less than 30 days.

So talk to us about V.me integration with your existing or new e-commerce platform.

Frictionless e-commerce

Frictionless e-commerce – yet another overused marketing phrase these days. To be honest, it did not start as a marketing term: "frictionless e-commerce", or more broadly "frictionless retailing", was coined by Jeff Bezos of Amazon.com.

Bezos's core idea was making online discovery frictionless. Over time, new contexts have been added to the list – frictionless checkout, frictionless browsing, frictionless customer acquisition, frictionless subscription – but none of them is more important than discovery.

So focus on discovery first. For frictionless discovery:

  • Product search and taxonomy come first – they matter most
  • Develop your information architecture around search and taxonomy
  • Once you get the above two right, focus on a content strategy (better product titles and descriptions)
  • Product recommendation comes next; make sure you use the taxonomy for context-based recommendations

Most e-commerce companies make mistakes around product search and taxonomy. E-commerce taxonomy is a science: the most powerful product taxonomies are built around ontologies, well-defined semantics and controlled vocabularies. Create a taxonomy strategy around hierarchical categories and tags, develop a clear distinction between categories and tags, and make sure your customers understand it.
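To make the categories-versus-tags distinction concrete, here is a minimal sketch in Python. All class names, field names and sample products below are illustrative assumptions, not part of any specific platform: a product sits at exactly one position in a category hierarchy, while tags are cross-cutting labels drawn from a controlled vocabulary so free-form synonyms don't creep in.

```python
# Minimal sketch of a product taxonomy: hierarchical categories plus
# flat tags from a controlled vocabulary. All names here are illustrative.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Category:
    """A node in the category hierarchy; every node has at most one parent."""
    name: str
    parent: Optional["Category"] = None

    def path(self):
        """Full hierarchical path from the root down to this node."""
        node, parts = self, []
        while node is not None:
            parts.append(node.name)
            node = node.parent
        return list(reversed(parts))

# Tags come from a controlled vocabulary, not free text.
CONTROLLED_TAGS = {"waterproof", "wireless", "refurbished", "full-frame"}

@dataclass
class Product:
    title: str
    category: Category                       # exactly one place in the hierarchy
    tags: set = field(default_factory=set)   # any number of cross-cutting labels

    def add_tag(self, tag):
        """Reject tags outside the controlled vocabulary."""
        if tag not in CONTROLLED_TAGS:
            raise ValueError(f"'{tag}' is not in the controlled vocabulary")
        self.tags.add(tag)

# Example hierarchy and product (hypothetical data)
electronics = Category("Electronics")
cameras = Category("Cameras", parent=electronics)
dslr = Category("DSLR", parent=cameras)

p = Product("EOS 5D Mark III", category=dslr)
p.add_tag("full-frame")
```

The design choice worth noting: the hierarchy answers "where does this product live?" (one answer), while tags answer "what properties cut across categories?" (many answers) – keeping the two separate is what makes the taxonomy scale.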

E-commerce taxonomy is one of the core areas of expertise of the iCODONS team. So talk to us and learn how we can help you develop a taxonomy that will scale with your catalogue, whether it holds 1 million or 10 million products.


Universal Analytics

After five months in closed alpha, Google Universal Analytics went into public beta last week. It is one of the most powerful offerings from the Google Analytics team.

Let's face it – universal tracking is hard. Consumers are using multiple devices, and businesses are using new platforms. Universal Analytics brings analytics for multi-channel, multi-platform and multi-device interactions together in one place.

Universal Analytics can be used to measure and track multi-channel interactions between customers and your business, offline or online. It can measure user activity in any digital environment as long as it is connected to the internet, and it opens up opportunities to collect and stream user-interaction (event/hit) data from any kind of digital device using the HTTP-based Measurement Protocol. That device could be a point-of-sale (POS) system or a supermarket rack equipped with sensors that collect and stream data over HTTP.
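To illustrate, here is a minimal sketch of sending an event hit over the Measurement Protocol from any HTTP-capable device such as a POS terminal. The tracking ID, client ID and event names below are placeholders; the parameter names (`v`, `tid`, `cid`, `t`, `ec`, `ea`, `el`, `ev`) follow Google's published Measurement Protocol.

```python
# Sketch: building and sending a Universal Analytics event hit over HTTP.
# Tracking ID and event values below are placeholders, not real properties.
import uuid
from urllib.parse import urlencode
from urllib.request import urlopen

ENDPOINT = "https://www.google-analytics.com/collect"

def build_event_hit(tracking_id, client_id, category, action,
                    label=None, value=None):
    """Encode a single Measurement Protocol event hit as a form body."""
    payload = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # property / tracking ID, e.g. "UA-XXXXX-Y"
        "cid": client_id,    # anonymous client ID (a UUID is recommended)
        "t": "event",        # hit type
        "ec": category,      # event category
        "ea": action,        # event action
    }
    if label is not None:
        payload["el"] = label
    if value is not None:
        payload["ev"] = str(value)
    return urlencode(payload)

def send_hit(encoded_hit):
    """POST the encoded hit to the collection endpoint."""
    return urlopen(ENDPOINT, data=encoded_hit.encode("utf-8"))

# Example: a POS system reporting an in-store purchase (placeholder IDs)
hit = build_event_hit("UA-XXXXX-Y", str(uuid.uuid4()),
                      category="pos", action="purchase", value=1999)
# send_hit(hit)  # uncomment on a device with network access
```

Because the protocol is just HTTP form parameters, the same `build_event_hit` helper works unchanged on anything that can open a socket – which is exactly what makes offline devices trackable.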

One of the important use cases will be measuring customer loyalty – how people become customers and remain loyal. Using Universal Analytics, your business will be able to measure offline-to-online conversion and vice versa. In a nutshell, Universal Analytics will enable the discovery of relationships between the channels that drive conversions.

Google is a digital company

Google used to be a search company, then a technology company, but it has now turned into a digital marketing company. This transition is quite visible, and it is good for Google's long-term future.

Make no mistake: this is the future of most technology companies. Yahoo is still considered a media company. At the end of the day, for most companies, technology is just a means to achieve the right kind of outcomes.

One of the best things about companies like Google, Facebook, Yahoo and Twitter is that, although they are labelled as digital or media companies, they are run by engineers. That may be ironic, but it is true. As Jeff Hammerbacher put it,

The best minds of my generation are thinking about how to make people click ads. That sucks.


No doubt, cloud computing is hot at the moment. Everyone is jumping onto the bandwagon before it becomes too late for them.

Currently, data clouds seem to be a major focus for most companies and institutions adopting cloud computing in their long-term strategy. These organizations use data clouds both for on-demand computation and to persist and manage data. Distributed and replicated data clouds not only enable faster access to resources but also ensure higher availability, scalability and fault tolerance. Data clouds have proven highly attractive to the scientific community as well. Large-scale genomic analysis on the cloud is one of many examples where the community is enjoying the best of on-demand computing and storage technologies. Most use cases in the life sciences community are focused on mining the huge amounts of data produced by third-generation sequencers.

Another niche area where cloud computing is making its way is simulation-based science and engineering. Compared to data clouds, modeling and simulation of science and engineering problems in scalable cloud computing environments is still in its infancy. There is some excitement around Amazon's recently announced High Performance Computing (HPC) cloud services, but there is a lot of uncertainty about the extent to which cloud-based HPC clusters can compete with on-premise HPC clusters or in-house dedicated machines. For instance, it remains to be seen how multi-tenancy in the cloud will affect HPC performance. Exclusive access to cloud computing nodes is far too expensive for both cloud infrastructure providers and users, especially when scientific applications require large numbers of nodes. In addition, dedicated or exclusive nodes do not fit well with the economics of cloud computing; in fact, multi-tenancy is a prerequisite for it. There are also concerns over processing, memory, storage and network usage patterns in shared multi-tenant environments. This is an open and unexplored area for both the scientific community and cloud infrastructure providers, and before people start adopting cloud-based HPC services, these concerns need to be addressed and explored through benchmarking studies. As Mohamed Ahmed suggests,

Cloud infrastructure is still lucrative if comparing its economics to building in-house HPC machines. However, cloud for HPC has to be efficient enough to reach proper performance ceilings without disappointing customers who probably experienced at a certain point to run their HPC applications on dedicated machines.

I could not agree more. The lack of performance guarantees on shared cloud infrastructure is a major issue that cloud computing users face regularly, irrespective of the type of application they are running. Currently, most cloud infrastructure providers guarantee only the uptime of their nodes; in some cases they provide persistent access to resources through fully reserved RAM and storage allocations without over-subscription. Some guarantee a minimum CPU availability proportional to the reserved size, but there are often huge gaps between what is promised and what is delivered. There is a growing demand for Service Level Agreements (SLAs) that cover both performance and availability. Compared to HPC applications, performance is not a major issue for data clouds: HPC applications are computationally intensive and can be highly demanding in a given time period, while data cloud workloads behave more uniformly.

In the next few weeks, through a series of blog posts, we will focus on some interesting modelling and simulation applications for high-throughput computational science built on cloud-based HPC clusters. So stay tuned.

IBM expert Jai Menon explains how organizations can build systems that recognize patterns in data, draw predictions about what might happen next, and prescribe solutions for the future.

Just a starter post for our brand-new blog: an inspirational TED talk by Simon Sinek. If you like the video, be sure to leave us a comment.