Wednesday, 23 May 2018

Mobile App Design Best Practices and Mistakes

www.toptal.com

BY CAMERON CHAPMAN - DESIGN BLOG EDITOR @ TOPTAL
In 2017, over 91 billion apps were downloaded from the iOS App Store and Google Play (a figure that doesn't include third-party app stores or app stores for other platforms). That's a lot of apps, roughly 13 for every person on the planet. With so many apps being downloaded, it's no wonder that the average app has a churn rate of 57% in the first month (users who don't open the app more than once during the first 30 days after downloading it) and a whopping 71% after 90 days.
If any part of an app is undesirable, or slow to get the hang of, users will opt to install a new one rather than stick it out with the imperfect product. Nothing is wasted for the consumer when disposing of an app, other than possibly a few dollars (and they know they can download the app again at any time). The only loss is the time and effort of the designers and developers.
So, why is it that so many apps fail? Is this a predictable phenomenon that app designers and developers should accept? For clients, is this success rate acceptable? What does it take to prevent your apps from being deleted without a second thought?
The most common mistakes range from failing to maintain consistency throughout the lifespan of an app to difficulty attracting users in the first place. It's challenging to design an app with intuitive simplicity without it becoming repetitive and boring. An app has to offer pleasing design and UX details without losing sight of its greater purpose.
Most apps live and die in the first few days, so following some basic mobile app design best practices and avoiding the most common mistakes will help designers create apps that live past that 90-day mark.

Common Mistake #1: A Poor First Impression

Often, the first use or first day with an app is the most critical period to hook a potential user. The first impression is so critical that it could be an umbrella point for all of the other mobile design best practices. If anything goes wrong, or appears confusing or boring, potential users are quickly disinterested.
The proper balance for first impressions is tricky, though. In some cases, a lengthy onboarding process to discover necessary features can bore users. Yet without proper onboarding, some apps will just confuse users if they’re not instantly intuitive. Creating an app that is immediately intuitive while also introducing users to the most exciting, engaging features quickly is a delicate balancing act.
Although it can be a good way to get someone quickly oriented, drawn-out onboarding can also stand in the way of users doing what they want to do with the app. Often, these tutorials are too long and are swiped through blindly.
Keep in mind that when users first use an app, they don't necessarily have any waypoints for how the app should function or what it can do. A proper beta testing process allows designers to learn how others perceive an app from the beginning. What seems obvious to the design team may not be obvious to newcomers.
Mobile onboarding design (by Johan Adam Horn)

Common Mistake #2: Designing an App Without Purpose

Avoid entering the design process without clear intentions. Apps are too often designed and developed in order to follow trends rather than to solve a problem, fill a niche, or offer a distinct service.
For the designer and their team, the app’s purpose will affect every step of a project. It guides every decision, from the branding or marketing of an app to the wireframe format to the button aesthetic. If the purpose is clear, each piece of the app will communicate and function as a coherent whole.
Conveying this vision to potential users means that they will understand the value an app brings to their life. The vision needs to be clearly communicated from the user's first impression. How quickly can the vision for the app be conveyed to users? How will it improve a person's life or provide some sort of enjoyment or comfort? As long as an app's usefulness is conveyed immediately to users, it's likely to be part of the 29% of apps that make it past the first 90 days.
When entering an existing market, there are apps designed for that space designers can study as a baseline. They can improve upon what is already out there or provide a unique alternative in order to stand out. They shouldn’t thoughtlessly imitate.

Common Mistake #3: Failing to Optimize User Flow

Designers should be careful not to skip over thoughtful planning of an app's UX architecture before jumping into design work. Even before getting to a wireframing stage, the user flow and structure of an app should be mapped out. Designers are often too eager to jump straight into aesthetics and details. This results in a culture of designers who generally under-appreciate UX and the necessary logic and navigation within an app.
Slow down. Sketch out the flow of the app first before worrying too much about the finer details. Apps often fail from a lack of flow and organization, rather than imperfect details. Once the design process takes off, always keep the big picture in mind. The details and aesthetic should then clearly evoke and reinforce the greater concept.
A well-thought-out user flow diagram (by Michael Pons)

Common Mistake #4: Disregarding App Development Budget

As soon as the basic features and functions of an app are sketched, it’s a good time to talk about the budget with the development team. This prevents spending a ton of time designing features and UX patterns that end up needing to be cut when the development team doesn’t have the resources to implement them.
Learning the average costs of constructing particular concepts is a valuable addition to a designer’s toolkit, as it makes it easier to adapt design thinking to economic constraints. Budgets should be useful design constraints to work within, rather than viewed as frustrations.

Common Mistake #5: Cramming in Design Features

Hopefully, rigorous wireframing and prototyping will make the distinction between necessary and excessive functions clear. Each individual mobile platform is already the ultimate Swiss Army knife, so your app doesn't need to be. Not only will cramming an app with features lead to a disorienting user experience, but an overloaded app will also be difficult to market.
Many failed apps try to cram too many features in from launch.
If the app can’t be explained in a concise way, it’s likely trying to do too much. Paring down features is always hard, but it’s necessary. The best strategy might be to gain users in the beginning with just one or two features before testing new additions in later releases to see what resonates with users. This way, the additional features are less likely to interfere with the crucial first few days of an app’s life.

Common Mistake #6: Dismissing App Context

Although purpose and end goals are important, they become irrelevant if not directed within the proper context. The UI for a given app may seem obvious to the design team, but first-time users and users from different demographics may not find it as intuitive. For example, millennial users of an app might find certain functions intuitive, while retirees might find those same things confusing (or vice versa).
Consider the immediate context or situation in which the app is intended to be used. For example, Uber's interface excels at being used very quickly. This is perfect because when a user is out with friends and needs to book a ride, they barely have to interrupt their conversation in the process. Uber hides a lot of support content deep within the app that only appears when the scenario calls for it.
Is your app meant to be accessed quickly and for a short period of time? Or, is this an app with lots of content that allows users to stay a while? How will the design convey this type of use? Consider these points carefully when mapping out your app’s UX flow.
Good mobile design should consider the context in which it is used.

Common Mistake #7: Abusing Notifications

Push notifications are a finicky part of app design best practices. Too many, and users will turn them off entirely, risking the app being forgotten about. Too few, and the same fate occurs.
But it’s not just the frequency of notifications that can turn users on or off. It’s also the content. Useful notifications, such as those notifying users of a new message or reminding them to make a daily check-in, are seen as helpful and necessary. Apps that send seemingly random updates or notifications about news that doesn’t directly affect the user are more likely to see their notifications turned off completely.
Every notification is a microinteraction that can either enhance the user experience and reinforce the overall usefulness of the app or risk alienating users and, in extreme cases, prompting them to delete the app altogether.
Push notifications are a delicate balancing act in good mobile design (by Jona Nalder).

Common Mistake #8: Overcomplicating App Design

The famous architect Mies Van der Rohe once said, “It’s better to be good than to be unique.” It’s vital that the design meet the specs in the brief before designers start breaking the box or adding other flourishes.
Design elements added to make a composition more visually appealing still need to add value to the user experience. Continue to ask throughout the design process: how much can I remove? Design reductively instead of additively.
Over-complexity is often a result of unnecessarily breaking conventions. Will the app really benefit from reworking the standard symbols and interfaces within mobile visual and tactile language? Standard icons have proven themselves to be universally intuitive. Thus, they are often the quickest way to provide visual cues without cluttering a screen.
Don’t let design flourishes get in the way of the actual content or function of the app. Often, apps are not given enough white space. While it’s vital to good design in general, it’s especially important for mobile designs, as a cluttered interface isn’t particularly touch-friendly.
The app design process can be reductive, rather than additive.

Common Mistake #9: Design Inconsistencies

If a design is going to introduce new standards, they have to at least be consistent across the app. Each new function or piece of content doesn’t necessarily have to be an opportunity to introduce a new design concept.
Is the text uniformly formatted? Do UI elements behave in predictable, yet pleasing ways throughout the app? Design consistency must find the balance between existing common visual language and avoiding being aesthetically stagnant. The balance between intuitive consistency and boredom is a fine line.

Common Mistake #10: Under-utilizing App Beta Testing

All designers should analyze the use of their apps with some sort of feedback loop in order to learn what is and isn’t working. A common mistake in testing is for a team to do their beta testing in-house. It’s imperative to bring in fresh eyes in order to really dig into the drafts of the app.
Send out an ad for beta testers and work with a select audience before going public, or use a testing service like UserZoom. This can be a great way to iron out details, edit down features, and find what's missing. Beta testing can be time-consuming, but it's definitely a better alternative to developing an app that flops.
Designers should pay special attention to mobile user testing as potential users are fickle (source: Fake Crow).
It's important for design teams to recognize just how competitive the mobile app market is and to do whatever possible to differentiate their offering from the hundreds or thousands of other apps occupying the same space. To do this, they need a coherent vision of what the mobile app is hoping to achieve. Following mobile app design best practices and using an iterative design process that incorporates user feedback throughout is one of the best ways to do this, and it will create an app that stands out.

UNDERSTANDING THE BASICS

How is beta testing done?

Beta testing is done by real users in a real environment as the final testing step before an app or product is released "live" to the public. Beta testing can be done via apps that record actual user behavior or in a more simplified way by using user interviews and surveys.

Proxeus CEO Antoine Verdon on Making Blockchain Accessible to All

coincentral.com
Antoine Verdon is the Co-Founder and CEO of Proxeus, a company working to make blockchain compatible with traditional companies and existing enterprise infrastructure. Founded in 2015, Proxeus aims to be “WordPress for the blockchain,” making it easy for organizations of all sizes to run their businesses on the blockchain.
The Proxeus team has already demonstrated some very impressive results with their three-layer technology model–from speeding up the registration process for new businesses to enabling the University of Basel to use blockchain to issue academic credentials secure from fraud.
As a serial entrepreneur and fintech pioneer, Antoine was named among the 100 top Swiss personalities by magazines L’Hebdo and Bilan (2010 and 2013). He founded Legal Technology Switzerland to work on delivering more accessible legal services through digital channels. He’s also a strategic advisor to Swiss banks on fintech topics.
There are many theoretical use cases for blockchain, but in the following interview, we find out how the technology is being applied already–and how it can be rolled out to all businesses.

Your background is in law and you’re well-known for your passionate stance on politics. So, how did you get into the blockchain space?
I actually never practiced law. During my last year of law school, I co-founded Sandbox, a global network of young entrepreneurs. Later, I moved into venture capital and was leading a fintech fund based in Zurich. In 2012, we started looking at blockchain as a technology that could profoundly disrupt banking. Later, we invested in a US bitcoin exchange and I observed the Ethereum ICO with great interest–that was the point when I understood that blockchain wouldn't just change banking, but redefine the work processes and business models of every single industry.
What is Proxeus all about and what problem are you trying to solve? How does the XES token work?
Our goal is to make blockchain technology more accessible. We want to make it possible for anyone to create their blockchain applications without requiring specific technical knowledge. The XES token connects our users in a decentralized ecosystem, without the need for us to stand in the middle of every transaction.
You dub yourselves "WordPress for the blockchain." What do you mean by this?
20 years ago, every website had to be coded in HTML manually. WordPress has changed that by allowing anyone to program a website structure in just a few clicks. We are doing the same for blockchain applications: in just a few hours, you can create the skeleton of a blockchain application for your project or company.
Proxeus claims to be blockchain-agnostic. How are you able to manage this, and why is it important?
In the blockchain development stack, we are positioning ourselves as middleware that can connect to any blockchain. When you send out an email with Outlook, you do not know if the protocol used is SMTP or POP3. The same will be true with blockchains in the future.
Can you tell us about your three-layer technology model? Who is Proxeus for? Who are the types of people that will interact with your product?
Similar to WordPress, we have a core platform completed by a series of modules, which, combined with a front end, allow the creation of decentralized applications. Anyone interested in prototyping and eventually building a blockchain application will be able to use Proxeus to create the framework.
Why is it important to allow outside developers to create DApps?
DApps represent a new way to build applications. In a traditional setup, developers must manage big and complex applications with multiple permission layers. In a DApp setup, everyone has their own small DApp, connected by the blockchain. The development and administration costs can be considerably reduced.
In your whitepaper, you liken the crypto-space to when businesses had to adjust to computers in the 70s, internet in the 90s, cloud in the 2000s, and mobile in 2010s. Is this really the next big change?
Blockchain will bring a whole new wave of efficiency and trust. It will transform traditional business models in the same way the internet has transformed commerce and communication.
How can businesses embrace blockchain faster, especially small-to-medium businesses with lower budgets? What will happen to businesses slow to adapt?
Small businesses may not be able to implement blockchain-based processes immediately, but those that started their learning curve early will have a clear advantage, and the laggards will eventually disappear. The goal of Proxeus is to provide a sandbox where everyone can come and test how their business can be run in a blockchain context.
How long do you think it will take before we see mass adoption of blockchain tech and what are the barriers to that happening?
We expect first live applications to be started this year, but it will take another 3-5 years before blockchain tech is adopted by the masses.
You recently demoed your product with some amazing results, helping the University of Basel to become the first in Switzerland to use blockchain to issue course certificates and secure academic credentials from fraud. Can you tell us how you did that?
We programmed the course certificate digitally so that the responsible person can enter data and have the document created automatically. The document is hashed and the hash is registered on the blockchain, allowing future employers to verify the authenticity of the diplomas they receive using a simple drag-and-drop interface.
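As a rough illustration of the verification flow described here (a minimal sketch, not Proxeus code, with hypothetical file names), the hashing and comparison steps might look like this in Python; the on-chain registration itself is omitted:

import hashlib

def sha256_of_file(path: str) -> str:
    # Return the SHA-256 hex digest of a file, e.g., a diploma PDF.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# At issuance, the university computes the hash and registers it on a blockchain
# (the on-chain write itself is not shown here).
registered_hash = sha256_of_file("diploma.pdf")

# At verification, an employer hashes the copy they received and compares it
# against the registered hash.
received_hash = sha256_of_file("received_copy.pdf")
print("Authentic" if received_hash == registered_hash else "Hash does not match")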
What are the impacts of this on higher education?
It will simplify the interaction between employers and universities at first, but it will also create a broader ecosystem of trust around academic credentials – I wouldn't be surprised if, at some point on LinkedIn, you could connect your crypto-identity in order to display "verified diplomas" on your profile.
It will also considerably reduce the administrative work for universities. In the end, isn't a master's degree the result of a sum of conditions that could be programmed into a smart contract?
You also used Proxeus technology to legally register a company, from start to finish, in under 3 hours (instead of the 4 – 6 weeks it normally takes) at the DigitalSwitzerland challenge. How was this possible?
Currently, in Switzerland, it takes about 10 days to create a company. This is due to the fact that the actors (entrepreneur, lawyer, bank, notary, company register) work in a sequence. With a smart contract on the Hyperledger blockchain, we could change that and register a company legally from start to finish in just 1 hour 37 minutes.
What other main use cases do you see Proxeus being applied to?
Every business will be able to automate its processes and connect them to smart contracts to move to the next stage of digitization.
Where did Proxeus get its name?
Proxeus is the Greek god of proxies, building a bridge between the traditional and the blockchain worlds.
What’s a typical day like for you? Do you have any free time at all?
I usually take some time to read in the morning, and whenever I can, I cook in the evenings. But those times, unfortunately, get rarer as we are now into the execution phase of the project – evenings and weekends are busy and I have a heavy travel schedule ahead.
Not a lot of people know this, but you were a saber champion in Switzerland, do you still get to practice at all?
Unfortunately, I haven't gotten to practice lately! I live close to the lake in Zurich, where I go jogging and sometimes swimming.
You call yourself a serial entrepreneur. So, what are the next steps for Proxeus and the next steps for Antoine?
For now, I am all-in on the current projects – Proxeus and blockchain more generally will still occupy me for a few years!

This article by Christina Comben was originally published at "CoinCentral.com": https://coincentral.com/proxeus-ceo-on-making-blockchain-accessible-to-all/

Friday, 18 May 2018

How Much Does It Cost to Develop An IoT App

customerthink.com
The world around us is the world of the Internet of Things. Different devices communicate among themselves, forming networks, which are combined with each other and with the Internet. IoT is present in almost all spheres of our lives, and it's now becoming popular in smart home systems, medicine, and machinery. The range of fields where we can use it is growing rapidly.
Many companies are interested in this technology and are already seriously engaged in its development. We're talking about giants such as Google, Samsung, Apple, LG, Intel, Qualcomm, and many others.
According to research from International Data Corporation (IDC), market transformation will increase spending on the IoT segment from $1.9 trillion in 2013 to $7.1 trillion in 2020. According to the forecast of Gartner analysts, by 2020 the number of connected devices will be almost 10 billion. The growth of the IoT-device market also means that the number of connected Internet of Things devices will exceed the number of connected mobile phones in 2018, according to statistics presented by Ericsson.
Now we have smart cars, houses, appliances, toothbrushes, watches, and even forks. Mobile applications for smartphones serve as a link between users and their devices because through them, we can manage Internet of Things devices. IoT development, and especially the creation of mobile applications, offers a huge number of opportunities and benefits. That's why IoT technology has become one of the priority areas of mobile application development in recent years.
Building an application for the Internet of Things is quite a difficult process that requires a lot of time and resources. You have to plan your budget before ordering one.
What things should you keep in mind? What aspects influence the IoT application development costs? Let’s review how much the average app can cost.
What may influence the app development cost
Like any other app, an IoT app's development cost depends on many factors:
i) What kind of app you need and how complex it is. The simpler the app, the less it costs.
ii) The amount of time required to develop an app. This point directly depends on the type and complexity.
iii) The number of team members you are cooperating with.
iv) The hourly rates of the programmers involved. It depends on the region they live in.
Now let's look at each factor in detail.
The factors influencing the IoT development cost
Kind of application and its complexity
Before starting, the team of developers must carefully analyze the app's goals and how it will be positioned. Based on the list of features, they will be able to estimate the final price. The more features they have to implement, the higher the cost will be.
The amount of required time
The process of IoT application development consists of three main stages: design, development, and post-development. Each step is mandatory, and each team must follow them to achieve the best results.
The design stage, in turn, consists of the following substages: information analysis, prototyping, and adding visual components. Each substage requires a certain amount of time:
i) Information analysis — 20-40 hours
ii) Prototyping — 40-80 hours
iii) Adding visual components — 80-300+ hours
These amounts are averages and may vary depending on the app.
In the development stage, the team creates the app according to the design. The average time reaches 400-1000+ hours.
Post-development includes testing and bug fixes. This stage may take the development team 35 to 170 hours.
The number of team members
Depending on the app, the development team may include a different number of members. A simple app will require fewer team members, while complex projects will require a bigger team.
An average team may include:
i) 1 designer
ii) 2 developers
iii) 1 tester
iv) 1 project manager
However, the extended team for a complex IoT development project will include back-end developers, business analysts, panel designers, and administrators. Again, this is an average and may vary project by project.
Hourly rates of the programmers
The hourly rate of programmers depends on the country or region they live in. The most expensive professionals are in Western Europe and the USA, and the cheapest are in Southeast Asia. Here are some basic rates:
i) Eastern Europe — $30-50 per hour.
ii) North America — $50-150 per hour.
iii) Western Europe — $65-130 per hour.
iv) South East Asia — $20-50 per hour.
Remember that the rate may vary and also depends on the programming language and the developer's experience with IoT development.
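To make the arithmetic concrete, here is a purely illustrative calculation that combines the rough hour ranges above with one sample hourly rate; the numbers are the averages quoted in this article, not a quote for any real project:

# Rough (low, high) hour ranges from the stages described above.
stages = {
    "information analysis": (20, 40),
    "prototyping": (40, 80),
    "visual components": (80, 300),
    "development": (400, 1000),
    "testing and bug fixes": (35, 170),
}

hourly_rate = 40  # e.g., an Eastern European rate of $40/hour

low_hours = sum(low for low, _ in stages.values())
high_hours = sum(high for _, high in stages.values())

print(f"Estimated effort: {low_hours}-{high_hours} hours")
print(f"Estimated cost at ${hourly_rate}/hour: "
      f"${low_hours * hourly_rate:,}-${high_hours * hourly_rate:,}")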
The average price of developing an IoT app
We discussed the factors which may influence the IoT application development cost. Here are some average prices depending on the app type.
i) Simple app — $1,000-$4,000 — you will have to provide all the necessary content, clear instructions, and examples of similar apps. If you are good with Photoshop, you may also provide the graphic design. Any additional components will cost more.
ii) Native app — $8,000-$50,000 — again, you will have to provide absolutely all the content. The development includes architecting and usability creation. Such IoT solutions will require a lot of front-end work.
iii) Game — $10,000-$250,000 — the hardest projects. Most modern games cost far more than $100,000, as they require a lot of effort from highly skilled specialists. The more tricky elements there are, the greater the effort required.
iv) Additional features may include the following:
A) In-app payments — $1,000-$3,000 — the user may get other elements or even buy the whole app. The price depends on the payment system, location in the app or on the server, as well as complexity.
B) Web services — $1,000-$3,000 — the ability to create a remote access point and update the app content with XML instead of changing the code. Before developing, contact the engineer to discuss how deeply you would like to implement it — it will save a lot of effort.
C) API integration — $500-$1,500 — with APIs, you will be able to integrate a lot of third party services like social networks, email, photo stocks, and other platforms and apps.
IoT application development is a serious process and very often requires a lot of effort.
Now you know the average budget, which will help you create a truly useful IoT solution.
Good luck!

Common App Store Optimization pitfalls and how to avoid them

betanews.com
Making an app is hard, but getting it noticed on the Apple App Store and Google Play Store can be an even more difficult task. Optimizing your metadata is the biggest hurdle to getting noticed and finding users.
There are many "tips and tricks" articles out there that promise to help, but very few talk about what to avoid. Fortunately, with a strong App Store Optimization (ASO) strategy, you can stand out from the crowd while keeping an eye out for ASO pitfalls. By maneuvering around these pitfalls, you’ll be one step closer to improving your app’s visibility.
  1. Keyword Fumbles
Keywords are incredibly important for getting your app noticed. After all, they’re what users input into their search query, so it’s vital that they’re incorporated into the metadata properly.
On the iOS App Store, you’re given a 100-character keyword bank to work with. From there, you can also incorporate keywords into your title and subtitle. This means that each word must count, but if you use low-volume or irrelevant words, they don’t.
Repetition, for instance, wastes space. Words used in the title and subtitle are already counted among the keywords and should not be repeated in your keyword bank. Using them again adds nothing and takes space that could otherwise be given to important keywords.
You'll want to make sure to include these keywords in the description as well; however, overusing them leads to another common pitfall: keyword stuffing. Cramming the same keyword into a description too many times will get the app flagged, so use each in moderation.
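As a small, hypothetical illustration of those two rules (not an Apple tool), a few lines of code can drop keywords already used in the title or subtitle and check the 100-character limit of the keyword field:

def build_keyword_field(candidates, title, subtitle, limit=100):
    # Drop keywords already present in the title or subtitle, then warn if the
    # comma-separated field exceeds the App Store's 100-character limit.
    used = set((title + " " + subtitle).lower().split())
    kept = [k for k in candidates if k.lower() not in used]
    field = ",".join(kept)
    if len(field) > limit:
        print(f"Warning: keyword field is {len(field)} characters (limit is {limit})")
    return field

# Hypothetical example app and keywords.
print(build_keyword_field(
    ["video", "streaming", "movies", "anime", "offline", "shows"],
    title="UsTube: Video Streaming",
    subtitle="Watch movies offline",
))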
On the Google Play Store, keywords work a little differently. Google's algorithm crawls the descriptions for relevant keywords so users can find relevant apps that match the particular words or phrases used in their search query. You'll want to include a good mix of single keywords and long-tail keywords (phrases) that address your app's mechanics and benefits. However, if these keywords are misused, developers hit another stumbling block: descriptions that fail to form coherent sentences using the keywords correctly.
  2. Small Fish Going After Big Fish
It might be an understatement to say there are a lot of apps on the market. Nearly any given app will be competing with others for clicks and installs. That doesn’t mean you shouldn’t make an app that would compete with them, but it does mean that you should be careful about targeting.
For instance, video streaming apps may find themselves competing with YouTube, Netflix, Hulu or Crunchyroll, to name a few. While leveraging searches for those apps via similar keywords is a good way to show up in some frequently-searched terms, the goal is to show up alongside them and differentiate from them, not to copy them completely.
If users have a choice between YouTube and a nearly-identical video app called "UsTube,"  they have no reason to go for the knockoff. However, if a second app shows up while searching for "YouTube" that offers different features, then curious users may download that app as well. Competitors should also have some variations in their keywords so they don’t always show up behind the larger, more popular apps.
In short, too many apps linger in obscurity when they try to simply copy the success of their competition. Those that succeed tend to find a way to differentiate themselves and offer something new and valuable while leveraging the success of their competitors.
  3. Poor Descriptions
Once the app gets noticed, a good description is vital to sealing the deal and getting users to click "Get." Unfortunately, the description tends to be the greatest pitfall of them all.
A good app store description must be many things. It needs to:
  • Properly entice potential users
  • Demonstrate the app’s value
  • Utilize keywords to ensure the app appears in user searches
Despite this short list, some developers fail to achieve any of the above points. Sometimes descriptions are too long and turn users away with blocks of text. Such an overlong description means developers are failing at a key point: explaining to their audience what they have to gain from the app. If users don't understand the purpose of the app, they feel no incentive to install it and go with another app instead.
It’s important to remember that the App Store description is the creator’s direct pitch to potential users. This is where they encourage them to install the app; falling short here could be a crippling stumbling block.
Fortunately, there are tricks to writing a great App Store description. Learn them well, and you can avoid these mistakes.
  4. Unimpressive Creatives
On both the iOS and Google Play stores, creatives are a key component to catching a potential user’s eye. The app icon, screenshots and preview video should demonstrate exactly how an app works, what it can offer and all the features therein.
Creating unique, engaging screenshots is vital. If the screenshot is unappealing, doesn’t demonstrate the value of the app, or doesn’t catch the user’s eye, it could be useless at best and detrimental at worst.
Similarly, the preview video must be engaging enough for the user to watch. Catchy music and in-app images that showcase the importance of its features are key to a successful video. If they don’t want the app by the time the video is over, it didn’t do its job.
Strong visuals are key, so to avoid this stumbling block, make sure each image and video demonstrates the app’s value and isn’t too cluttered. Also, make sure to follow any specific guidelines laid out by Apple or Google to avoid rejection.
Now You Know
Despite there being some common stumbling blocks on the way to developing a strong App Store Optimization strategy, there are many ways to avoid them. Strong keywords, well-written descriptions, marketing, and creatives all go hand in hand and help improve an app's visibility in the app stores. Each link in the chain is important, so make sure you don't kink one, or it could hurt your app's discoverability in the long run.

Wednesday, 16 May 2018

Applications of Statistics for Measuring Company Growth

www.toptal.com
BY ERIK STETTLER - FINANCE EXPERT @ TOPTAL

Executive Summary

Once you have obtained your top-line growth metrics, the analysis can really begin.
  • There are many factors, both internal and external, that bear upon a company's top-line figures, such as revenue and user growth.
  • What is key is to be able to isolate the effects of intended actions, like marketing and PR, to understand how effective they were for future use.
  • Tools commonly applied in financial markets can easily be applied to conventional business practices.
Figure out what aspects of growth you want to measure, then create a benchmark.
  • Business growth strategies have three variables that can be deconstructed and measured: top-line user growth, retention, and engagement.
  • A simple regression model, most commonly via the Ordinary Least Squares method, can then ascertain the benchmark "normal" growth experienced by the business. This can test a range of internal and external forces for their influence on growth performance.
  • Don't treat each growth strategy employed as an isolated event—a series of events may have varying performance at each point in time, but their impact overall may combine to produce significant results
Start building an analysis process: Weed out red herrings and look towards continual improvement via machine learning.
  • If a significant but one-off event (like a C-suite resignation) occurs, mark the data point as a confounded unrelated event, with an indicator variable. These events in themselves can also be tested over time for their isolated effects.
  • Take into account the non-linearity of certain events occurring at the same time. The positive-negative effects may not be the same, as in isolation. We see this in public markets, where companies can "take a bath" by releasing a deluge of negative news all at once, leading to an initial "fixed" hit from the fact of bad news, with marginal subsequent effects.
  • Also start building a process for your growth metrics—as you garner more data over time, the accuracy of the insights will increase.
  • Look at automating data feeds into the process and harvest data from other areas of the organization (for example, link to GitHub to test for the effect of software updates). Over time, applying iterative machine learning principles to your growth measurement will only increase its value toward your company understanding its progress.
In part as a follow-up to my previous article on how to identify the drivers of growth in businesses, I now want to go further down the rabbit hole and look into how you can then measure the impact of growth initiatives. I will provide some tools for assessing the impact of actions such as product updates, PR, and marketing campaigns on customer growth, retention metrics, and engagement. This represents reflections from my previous work as a statistician, helping companies to assess the impact on their valuation of internal and external events via the reactions of their traded securities.
I believe that statistical impact tools, more commonplace in the hedge fund and Wall Street world, can be of far more use to technology companies for managing growth than how they are currently applied. Due to technology making a range of high-frequency information available to us on user or client behavior, a skilled statistical or data analyst can be a real asset within commercial teams.

There Are Many Ways to Measure the Impact of Growth

As an example of measuring statistical impact on valuation, let’s assume that a publicly-traded company announces a new product and wishes to know the extent to which it impacted its valuation. Estimating the real impact requires accounting for:
  1. How the market itself performed that day, in the context of the security’s correlation with it.
  2. The effect of any other company-relevant information released at the same time.
  3. The simple fact that securities prices and user behavior move on a daily basis from general variance, even in the absence of new information.
  4. Longer-term impact, in terms of a statistically significant trend in price increase.
For a private company, the same analysis can be done on the change in active users, or clients, both in the short- and long-term, which serve as the corollary to stock price activity. This also applies to retention and depth of engagement metrics.
Establishing this rounded form of analysis allows companies to direct their limited resources based on far stronger information signals, rather than get led astray by what might appear to be a market or user reaction that in fact represents nothing more than random fluctuation. The initial work to set up the statistical model that separates the signal from the noise can yield tremendous dividends via the insights that it brings to a company’s growth efforts. It is also an iterative process that can easily (and often automatically) be updated and refined as new data is received.

Selecting the Target Metric to Test

Any measuring effort by a company should target at least one of the following dimensions of growth:
  1. Top-line growth, defined as the change in total sales or active users/clients over time.
  2. Retention of users and clients, defined as the average lifetime of any given user or client.
  3. The depth of engagement of users and clients, defined as either the frequency of the core action taken or volume of transacting via the platform.
Graphic representation of the value of growth triangle
All three dimensions are quantifiable, and the company can conceptualize its value as the area of the triangle formed by these three points. If one collapses, then the value potential from the other two is severely constrained. While I certainly agree with many founders and investors that “a few users who love you is better than many who like you,” I do not believe that this contradicts the importance of top-line growth in addition to strong engagement and retention. The trajectory matters much more than the level, and beginning with a smaller group of truly dedicated users best sets the initial conditions for long-term growth in the first place.
The key task for the company is to then establish the analytical framework that allows for measuring the true effects of their actions on one or more of those three key metrics. The company may either test different models for each or use tools such as simultaneous equations to link them more directly. Marketing and PR efforts, in my experience, tend to particularly suffer from a lack of rigorous analysis on whether the company is receiving a return on its investment. Certain metrics, such as total views, clicks, and shares, are almost always recorded, but these are all means to an end, and the next question of the effects on customer conversion and engagement is rarely given serious analysis.

Choosing the Benchmark and Example of a One-time Event

We begin with the simplified version of a one-time event. Let’s assume a company releases a new product update or publishes a big PR story on Day 0 and wishes to know whether it represents a move in the right direction in terms of effect on growth. Determining whether a real signal has been received that the company should continue with similar efforts requires knowing how much it increased, versus how much it would have, absent the event in question.
The benchmark growth can be estimated via a regression model that predicts the company’s growth, retention, or engagement based on external and internal variables. In certain cases, the ability to isolate those users that are affected by a product update allows for direct A/B testing with a control group. This is not the case, however, for larger-scale product, PR, and business efforts that affect all current and potential users somewhat uniformly. While there are some excellent resources available for such testing, many early-stage companies can find them expensive.
Variables that can be considered for this model include:
Sector trends
  • Growth in your relevant sector in terms of total volume of sales.
Target customer trends
  • Differs from sector trends in that this focuses more on the growth of your target customers themselves, whether or not they are already doing business with your sector.
The S&P 500 plus additional sector-relevant sub-indices
  • If your clients are financial firms, or may be affected by direct or psychological effects of the capital markets.
Macro variables such as interest rates and exchange rates
  • Depending on your business model, interest rates and exchange rates may affect the competitiveness of your offering.
Internal drivers such as referral rates
  • Any company's growth is a combination of external and internal factors. Internal metrics that are important to track in their own right, such as referral rate from current users (which could be an important momentum effect), user satisfaction ratings, social media activity, and so forth may all be useful.
Seasonality/cyclicality
  • Indicator variables, which equal 1 if a condition is met (for example, month falls during holiday season) and 0 otherwise, can be used to control for the effects of any month of year/day of week that might be relevant to your user activity.
All variables should be specified as a rate of change rather than absolute level, using logarithms rather than percentages.
The time frame for each variable likewise needs to be carefully considered. Some variables are leading (the stock market for example is heavily based on expectations), while others such as user satisfaction ratings are based on past experience but certainly may bear relevance for expected growth.
For the regression itself, I recommend beginning with Ordinary Least Squares (OLS) and then only moving on to other functional forms for specific reasons. OLS is versatile and likewise allows for more direct interpretation of the results than other more complex forms. Modifications in the context of OLS would include a logarithmic regression for nonlinear variables, interaction variables (for example, perhaps current customer satisfaction and social media activity), and squaring variables that you believe have disproportionate effects at larger values. Since growth is hopefully exponential, logarithmic regressions could certainly prove a strong fit.
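As a minimal sketch of the benchmark regression described above, assuming a pandas DataFrame with hypothetical columns (log_user_growth, sector_growth, referral_rate, and a holiday indicator), the OLS step could look like this with statsmodels:

import pandas as pd
import statsmodels.api as sm

# Hypothetical history: one row per week, all growth variables as log rates of change.
df = pd.read_csv("growth_history.csv")  # columns assumed: log_user_growth, sector_growth, referral_rate, holiday

X = sm.add_constant(df[["sector_growth", "referral_rate", "holiday"]])
y = df["log_user_growth"]

model = sm.OLS(y, X).fit()
print(model.summary())

# Expected ("benchmark") growth for each period given the explanatory variables,
# and the abnormal portion left over.
df["expected_growth"] = model.predict(X)
df["abnormal_growth"] = df["log_user_growth"] - df["expected_growth"]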
A chart displaying the difference between observed values and predicted values
Regarding the time horizon of impact of the action, be sure to consider your users’ frequency of actions or purchases to help you determine the proper interval over which to look for the impact. When using timeframes longer than one day, remember that weekly active users is not the sum of the daily active users that week. If I actively use your product every day that week, then I would be counted each day for a daily analysis. If you then change to a weekly analysis, I should only show up once and hence summing the individual days would over-count.
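A short sketch of that de-duplication point, assuming an event log with hypothetical user_id and date columns: weekly active users counts each user once per week, never as the sum of that week's daily counts.

import pandas as pd

# Hypothetical event log: one row per user action.
events = pd.read_csv("events.csv", parse_dates=["date"])  # columns assumed: user_id, date

# Daily active users: unique users per calendar day.
dau = events.groupby(events["date"].dt.date)["user_id"].nunique()

# Weekly active users: unique users per week. Note this is NOT dau summed over the
# week, since the same user may appear on several days.
wau = events.groupby(events["date"].dt.to_period("W"))["user_id"].nunique()
print(wau.head())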
This model then allows you to estimate expected growth/retention/engagement for any given moment or ongoing time period based on the performance of these explanatory variables. The difference between this expected growth and the actual growth observed after the event is then the abnormal portion that may indicate impact. Dividing this abnormal growth by the standard deviation of the expected growth then indicates how likely the abnormal component was to occur by chance. Typically, a result of 1.96 (being approximately two standard deviations away from the predicted value) is used as the cut-off for deeming that it did not occur by chance.
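In code, the test reduces to a few lines. The numbers below are hypothetical placeholders for the expected growth produced by the benchmark model, the growth actually observed after the event, and the model's typical unexplained variation:

expected_growth = 0.020   # benchmark prediction for the week (log rate of change)
observed_growth = 0.055   # what was actually observed after the event
prediction_std = 0.015    # standard deviation of the unexplained (abnormal) growth

abnormal = observed_growth - expected_growth
z = abnormal / prediction_std  # roughly 2.33 with these placeholder numbers

if abs(z) > 1.96:
    print(f"z = {z:.2f}: unlikely to have occurred by chance")
else:
    print(f"z = {z:.2f}: within normal variation")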
In the context of cohorts, retention and engagement can either be considered in terms of change across successive cohorts (in other words, holding the values fixed for each cohort), or the change over time of total retention and engagement, without breaking it down by cohort.

Cumulative Growth Impact from a Series of Events

Growth strategies often make a point of deploying a series of events rather than one-time efforts, both for the more immediate impact of having multiple efforts and the underlying impact of showing customers the pattern itself. Impact analysis can therefore look at cumulative impact as well. A series of events that are individually insignificant can result in a significant cumulative impact, and conversely a series of significant events can net out into insignificance.
The first situation can be thought of as “slow and steady wins the race.” Let’s say your sales increase by a fraction of a percent per week more quickly than your relevant sector. Over a short period of time this would mean nothing, as any given company’s growth will slightly differ from the benchmark by chance. If your slight over-performance continues on for long enough however, then eventually you can state with confidence that the company’s growth rate truly does exceed the market’s.
The second situation is essentially any kind of reversal. The increasingly high-frequency means by which people can react to developments before truly processing the information, as well as short-term herd mentality, brings the challenge of ensuring that you consider the true magnitude and duration of the reaction through the more immediate noise. Under certain circumstances, users and the markets may tend to systematically over-react in the short-term (new technologies, currency markets, and often bad news that does not represent a serious threat to a company) but then later correct themselves.
The two situations can be illustrated as follows. The confidence interval indicates the bounds within which we can expect 95% of observations to fall, which is typically used as the threshold for deeming something statistically significant.
Chart showing the various confidence boundaries for a plot of active users over time
The absence of a significant reversal can be taken as evidence of lasting impact. One must be cautious with this logic as it runs contrary to the normal rule of empirical skepticism that absence of evidence is not evidence of absence, but it is the best we can do.
Be careful when comparing percent/logarithmic changes over individual time periods. A decrease of 99% followed by an increase of 99% does not exactly net out to an insignificant cumulative change. Be sure to consider cumulative change in the end.
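A quick arithmetic check of that warning: percentage changes do not net out, whereas working in logs keeps the cumulative change honest.

import math

start = 100.0
after_drop = start * (1 - 0.99)          # down 99% -> 1.0
after_bounce = after_drop * (1 + 0.99)   # up 99%  -> 1.99, nowhere near the original 100

cumulative_log_change = math.log(after_bounce / start)  # about -3.92, a severe net decline
print(after_bounce, cumulative_log_change)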
If you are measuring the cumulative impact of a series of events such as a specific PR campaign within a limited period of time (i.e., a holiday season), then you may wish to track the growth over all calendar days or weeks included in the timeframe, whether or not each one had a specific action taken. You are still essentially hoping that the 1-2-3 punch yields a knockout within a specific period even if there might be slight delays between hits.
If the events in question are further apart but you still wish to assess cumulative impact, you may consider fitting them together into a single continuous series of days and then running the same analysis. In this, you are essentially saying, "Day 1 is January 5, Day 2 is March 15, Day 3 is April 10…" and testing their cumulative change vs. that predicted by the benchmark as if they were in fact completely sequential dates. Testing the significance then uses the same formula as with singular events, except that in this case the standard deviation is scaled by the square root of the number of days/weeks that form the cumulative period.
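A sketch of that cumulative test, using hypothetical per-event abnormal growth figures (observed minus expected, in logs) stitched together as if the event days were sequential:

import math

abnormal_by_event_day = [0.012, 0.018, 0.009, 0.021]  # hypothetical abnormal log growth per event day
single_period_std = 0.015                              # typical unexplained variation in one period

cumulative_abnormal = sum(abnormal_by_event_day)
cumulative_std = single_period_std * math.sqrt(len(abnormal_by_event_day))

z = cumulative_abnormal / cumulative_std  # 0.060 / 0.030 = 2.0 with these placeholders
print(f"Cumulative abnormal growth {cumulative_abnormal:.3f}, z = {z:.2f}")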

Dealing with Contaminated Information When Measuring Company Growth

The world rarely affords us the courtesy of perfect laboratory conditions to test out our ideas, so once the core model is established, it will most likely need to control for other information that affects the expected growth rate at the same time as the actions we’re seeking to measure.
Let's assume that at the same time as a PR event or product update, a top executive unfortunately decides to leave for a competitor amidst much fanfare from the press and you become concerned that some users might take this as a signal of the relative merits of the two products. One very quick, if blunt, solution is to simply mark the data point as confounded by an unrelated event, using an indicator variable.
However, if you can obtain data on previous instances of the “confounding” event, then you can conduct a cross-sectional analysis that allows you to predict how much of an impact that particular event tends to have in similar circumstances, and you can remove that expected impact from the final results. In the above example, data on user activity surrounding high-profile team member departures in other companies would allow you to estimate and separate out the effect of that particular factor in order to isolate the effect of the PR event or product update that you hoped to evaluate.
Many companies also may face seasonality based on time of year or certain key moments such as holidays. Assign indicator variables to the time of year in question to control for this.

The Non-Linearities of Certain Impacts

As you consider the results of your analysis and strategies for growth efforts, certain nonlinear effects in how people have been documented to react to positive developments are worth bearing in mind.
Up- versus down-sensitivity can be very different. Data and time permitting, consider estimating the expected effects of both positive and negative events, if both are relevant to you. Unfortunately, downward movements in many cases ranging from user behavior to the financial markets can be far more abrupt and severe than upward movements.
The combined effect of performing multiple actions at once may not equal that of performing them in sequence because the very fact of the ongoing pattern can itself have positive or negative effect. The pattern of a company releasing a product update every month can instill confidence in users while announcing negative events such as layoffs or write-downs more than once can have a vastly disproportionate effect by causing the worry that the company doesn’t fully understand its own situation. Publicly-traded companies will often “take a bath” and release all bad news at once, as there can tend to be an initial “fixed” hit from the fact of bad news itself, with a marginal subsequent effect. The “Torpedo Effect,” for example, describes the empirical phenomenon that the mere presence of bad news can account for a meaningful portion of a price drop. Negative drops can therefore be broken down into an initial fixed effect that gives way to a diminishing marginal effect from the actual content of the news or development. PR campaigns work better as a sequence than one singular mega-event, as the goal is to position the company over time.
Variance can of course only be measured historically, but certain events might change the underlying true variance and probability that the abnormal growth happened by chance. As the new variance is itself the result of the event in question, the prior variance should be used in order to avoid the circular reasoning of dismissing the significance of the event based on the larger variance that comes with it. As always however, there is debate and each situation may be different.
As previously mentioned, growth or a slowdown in growth can both beget similar effects for a while, due to both human psychology and very real market structures. While there are various fancy autocorrelation tests available for measuring momentum effects, I find the more “manual” approach of regressing the growth series on a lagged version of itself to be more transparent and easier to experiment with.
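The "manual" approach mentioned here can be sketched as an OLS regression of the growth series on a one-period lag of itself, with hypothetical weekly log growth figures standing in for real data:

import pandas as pd
import statsmodels.api as sm

# Hypothetical per-week log growth rates.
growth = pd.Series([0.021, 0.034, 0.028, 0.040, 0.037, 0.049, 0.045, 0.052])

data = pd.DataFrame({"growth": growth, "lagged": growth.shift(1)}).dropna()

momentum = sm.OLS(data["growth"], sm.add_constant(data["lagged"])).fit()
print(momentum.params)  # a positive, significant coefficient on "lagged" suggests momentum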

Concluding Thoughts on Approaching Machine Learning in Business

Once the model that allows for such testing has been developed, there is no reason why the company’s platforms for tracking user behavior, sales, etc. cannot be directly linked to its code to continuously update the coefficients as new data is received. My personal preference has always been to have a rolling one-year estimation period when possible, in that it balances the size of the dataset with the higher value of more recent information and also naturally includes all times of the year in case of seasonality.
Assuming no structural breaks in the nature of the business and product, there is no reason to not extend the estimation period beyond one year, but young rapidly-growing companies tend to evolve quickly. Software-driven companies could link directly to their GitHub to create the process by which software updates are automatically tested for impact. By creating this direct link and allowing the functions to evolve automatically, you have taken the first step toward deploying machine learning for your company.
It is often pointed out how information is the world’s most valuable commodity, but it is less often mentioned that data is not information. On the contrary, companies are overwhelmed with so much data that may seem to tell competing narratives, many of which can be just spurious patterns based on randomness. Statistics at its best is a process of reduction—of rapidly honing in on the key variables and relationships and deploying them for practical testing. The spirit of this form of analysis above all is imbuing healthy skepticism into the decision-making process by forcing the data to prove itself as real information before you base a decision off of it.

UNDERSTANDING THE BASICS

How is growth measured?

Growth is measured across three variables: top-line growth (change in total sales or active users/clients over time), retention (average lifetime of any given user or client), and depth of engagement (frequency of the core action taken or volume of transacting via the platform).