Analytics giant Gartner continues to be the pacesetter in software development. In its 2019 market guide, the analytics house explores the new players and markets shaping the industry, as well as the emerging technologies set to dominate it. For industry players, Gartner’s report is a great resource for learning about customers’ changing trends, current preferences, challenges, and future data needs. The major takeaways from the 2019 guide are highlighted below:
Who are the major service providers in 2019?
Though Gartner doesn’t endorse any provider, its report gives crucial information about the currently active players in the industry. The majority of the players listed in the market guide are major vendors in verticals such as data management, BI, and data analytics. The guide lists vendors that specialize mainly in SMEs (small and medium enterprises), with critical consideration given to each vendor’s geographical location. Some of the listed vendors include Absolute Data, Affine Analytics, Caserta, SDG Group, EPAM, EXL, and McKinsey Digital, among others. The full list of vendors is available here.
The conversation dominating the 2019 guide is the concept of multiple service providers. Rather than relying on one all-in-one provider, Gartner recommends procuring different services from different providers. Providers differ in their service capabilities, and by procuring from a diverse set, an organization benefits from each one’s specialization.
Are data science and artificial intelligence the industry’s game-changers?
Data science, artificial intelligence, and machine learning are trending topics that have dominated Gartner’s reports in past years. According to Gartner, “artificial intelligence (AI) and machine learning (ML) have attained a critical tipping point, and will continue to extend to the majority of technology services, applications or things.” This year’s guide shifts the conversation further. Instead of the traditional emphasis on the need for data science, this year’s focus is on applying data science to customer needs: finding new ways to use AI to create customer-centric products.
According to Gartner, artificial intelligence will be among the top five investment areas for the majority of tech startups in 2020. Revenue from artificial intelligence software is forecast to grow from $9.5 billion in 2018 to $118.6 billion by 2025. This forecast has created a gold rush, with every company positioning itself to reap the rewards of artificial intelligence. Some of the industries that have invested most heavily in artificial intelligence include:
- Business services
- Legal services
- Public sector
Gartner predicts a paradigm shift in artificial intelligence from skepticism to full dependence over the next decade. Most consumer applications are predicted to rely on artificial intelligence, so the level of skepticism witnessed among consumers is a major concern, and there is a need to invest in customer education to overcome this mistrust. A lack of crucial skills is another major hindrance predicted to limit the penetration of artificial intelligence; hence the need to retain current talent while investing in new talent to handle the demands of data science.
What is the future of artificial intelligence?
The trends expected to dominate the world of artificial intelligence include:
- Machine learning (ML) is expected to become an integral part of most systems, both small and large. Owing to its growing dominance, machine learning is expected to be offered as a cloud-based service, referred to as MLaaS (machine-learning-as-a-service)
- Due to close interaction between AI and ML, artificial intelligence systems will continue to provide ML algorithms which will enable ML to ‘continue learning’
- Hardware vendors will experience a gold rush to provide CPUs with the capacity to handle large data quantities and ML data processing
- IoT (Internet of Things) is expected to support ML through the use of multiple technologies. More collaborative learning is expected to dominate the world of ML
- Software developers will continue to use personalized computing environments in the development of intelligent applications by employing assisted programming
- Quantum computing will continue to boost the performance of ML
Why is strategy consulting dominating vendor portfolios?
With the rapidly changing business environment, the need for strategy consulting among firms is increasing at a fast pace. Gartner estimates that 18% of start-ups hired a strategy consultant in 2018, and this growing demand has led to an increase in the number of companies offering strategy consulting.
How does strategic consulting work?
Strategy consultation employs in-depth data from many industries to help organizations implement the best strategy. For instance, company X has been experiencing losses and wants to close one of its branches. The strategic questions that company X must answer are:
- Which is the best branch to close without affecting the company’s output?
- How much money will the company save by closing the branch?
- How should the company’s operations be restructured to make up for the output of the closed branch?
- How much will the restructuring process cost?
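As a rough illustration of the first two questions, the comparison can be reduced to each branch’s net contribution (output minus running cost). The branch names and figures below are entirely hypothetical:

```python
# Hypothetical branch figures: annual output (revenue) and running cost.
branches = {
    "North": {"output": 1_200_000, "cost": 1_350_000},
    "South": {"output": 2_500_000, "cost": 1_900_000},
    "East":  {"output": 900_000,   "cost": 1_100_000},
}

def net_contribution(figures):
    """Output minus running cost; negative means the branch loses money."""
    return figures["output"] - figures["cost"]

# The candidate for closure is the branch with the lowest net contribution.
candidate = min(branches, key=lambda name: net_contribution(branches[name]))

# The annual saving from closing it is the (negative) contribution avoided.
savings = -net_contribution(branches[candidate])

print(candidate, savings)
```

A real engagement would of course weigh many more factors (restructuring costs, customer overlap, staffing), which is why the remaining questions require wide consultation rather than arithmetic alone.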
Strategic consultants work with the organization in answering these questions. To get the best strategy, the consultant must consider each move from every angle. The process calls for wide consultations with the top executives, members of staff, government authorities, and other key players in the sector.
Gartner’s guide outlines several benefits of strategic consulting. They include:
- It gives an organization a clear, unbiased opinion on the best strategic move to make
- The strategic consultant should provide fresh new ideas to be pursued by the company
- The company can get insights on the most effective approach to pursue to ensure growth and business sustainability
- It enables an organization to pursue results-oriented strategies aimed at pre-determined outcomes
Data architecture and Data lakes
The concept of data lakes has dominated the software industry for a while. Over the last two years, however, interest in data lakes had been declining; the 2019 Gartner report appears to have brought the concept back to life.
What is a Data Lake?
A data lake is a centralized repository that lets users store large volumes of raw data. In this context, raw data is unstructured data that has not been exposed to any manipulation or analysis. The sources of data lake data are varied, as are its users. The features that make data lakes popular are that they accept data from all sources, support different types of data, and can store data in various forms.
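The idea of accepting any data in its native form can be sketched in a few lines. This is a minimal illustration, not any vendor’s implementation; the source names and files are made up:

```python
import json
import pathlib
import tempfile

# A minimal sketch of lake-style storage: raw records from different
# sources land unmodified in per-source folders, keeping their native format.
lake = pathlib.Path(tempfile.mkdtemp()) / "datalake"

def ingest(source: str, filename: str, payload: bytes) -> pathlib.Path:
    """Store the payload exactly as received, organised only by source."""
    target = lake / "raw" / source / filename
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(payload)
    return target

# Heterogeneous raw inputs: JSON events, a CSV export, a free-text log.
ingest("clickstream", "events.json", json.dumps({"user": 1, "page": "/"}).encode())
ingest("crm", "contacts.csv", b"name,email\nAda,ada@example.com\n")
ingest("app", "server.log", b"2019-06-01 INFO started\n")

# Structure and analysis are applied later, when the data is read
# ("schema on read"), rather than at ingestion time.
stored = sorted(p.name for p in (lake / "raw").rglob("*") if p.is_file())
print(stored)
```

The key contrast with a data warehouse is that nothing here enforces a schema at write time; that discipline is deferred to whoever reads the data.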
The discussion of data lakes in the 2019 Gartner report focused on their improvement. In the past, data lakes were treated as a dumping site for raw data with no vision; organizations reported that large volumes of raw data stored in data lakes were useless to them. In this year’s report, data lakes are exploited as part of an organization’s data management strategy. One way of achieving this is to complement data lakes with the data warehouse, a shift from the past, when data lakes were considered a replacement for data warehouses.
Some of the data architecture trends highlighted include:
- Rise of converging data platforms. These platforms accommodate more data from several sources. Further, converged data platforms enable data scientists to mirror the data repository from different data centres
- Big data projects will continue to utilize RDBMS, which will lead to its growth
- With the growing number of managed data warehouses, ETL and the majority of warehouse services will be outsourced to third-party vendors
- Increased growth of cloud-based data architects
- Data warehouses will be replaced by Hadoop and Spark as the rise of big data continues
- Increased speed, flexibility, and scale expected to match the needs of machine learning, AI, and data architecture
- In-memory databases will continue to dominate data storage as businesses’ speed of execution increases
- Data lakes will continue to gain prominence as they come to be viewed as an umbrella over data warehouses and data marts
Cloud deployment continues to rise
Cloud computing is another area that dominated the 2019 market guide. The motivators for adopting cloud deployment are many, the dominant ones being agility and reduced management costs. These factors drive the adoption of cloud data warehouses over building a data center. Snowflake is the dominant cloud data warehouse vendor and a major competitor to Microsoft’s Azure. Some of the reasons cloud deployment is popular include:
- On-demand services- with cloud deployment, customers can sign-up, pay, and start using cloud services without the involvement of an agent
- Resource pooling; with cloud deployment, different individuals, organizations, or departments can share resources such as data centers, servers, and storage
- Rapid expansion; cloud deployment enables an organization to scale its resource requirements up or down quickly
- Measured services; customers pay for what they use, in a ‘pay as you go’ model.
- Cloud deployment enables broad network access in an organization. This is key while scaling the operations to meet growing data needs
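The ‘measured services’ model above can be sketched in a few lines. The service names and unit rates here are hypothetical, not any vendor’s actual pricing:

```python
# Hypothetical per-unit rates for three metered services.
RATES = {
    "compute_hours": 0.085,     # per instance-hour
    "storage_gb_month": 0.023,  # per GB stored for a month
    "egress_gb": 0.09,          # per GB transferred out
}

def monthly_bill(usage: dict) -> float:
    """Sum metered usage times unit rate; unused services cost nothing."""
    return round(sum(RATES[item] * qty for item, qty in usage.items()), 2)

# A small team that only used compute and a little storage this month:
bill = monthly_bill({"compute_hours": 200, "storage_gb_month": 50})
print(bill)
```

The point of the model is in the last line: the customer is billed only for the dimensions actually consumed, with no up-front capacity purchase.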