The Top 10 Fintech Trends for 2021
Top predictions according to experts

In this issue

The Top 10 Fintech Trends for 2021

Research from Gartner: Hype Cycle for Emerging Technologies, 2020

About

The Top 10 Fintech Trends for 2021

Executive Overview

• Article Highlights

FinTech is technology-enabled innovation in financial services that could result in new business models, applications, processes or products with an associated material effect on the provision of financial services. (This definition was proposed by the Financial Stability Board in 2016 and has become a global consensus.) As a new round of technological revolution and industrial transformation arises, what types of technical innovation will emerge, and how innovative technologies will be applied in new financial scenarios, brings new challenges.

As an early explorer of fintech, Ant Group has focused on innovation for 16 years and constantly uses innovative technologies to reimagine financial services. A series of best practices have been created in financial scenarios such as payment, credit, wealth management, and insurance.

To seek out the future trends of fintech, Ant Group has invited professionals around the world to intensively study the financial applications of emerging technologies, including artificial intelligence, blockchain, computing and digital security. These Ant Group-led discussions have been summarized below, forming the top ten fintech trends of 2021 that we wish to share with related industries and the world.

Three key methodologies were used in the study process of this article:

Industry survey: Results presented are based on the latest fintech technology studies; fintech policies issued by governments around the world; and technology trends, application scenarios and best practices collected from publicly available materials.

Expert interview: A dozen top scientists in the fields of artificial intelligence, blockchain, security technology, digital analytics, computing and IoT, together with senior practitioners in various innovation fields of digital finance, were interviewed for their prospective insights into the future of fintech.

Empirical study: Ground-breaking applications and cases of Ant Group and other leading fintech companies were studied and analyzed to verify the practicality of technology trends in financial scenarios.

• Key Conclusions

To forecast the development trends of fintech, it is necessary to broaden the focus to the entire digital economy. Undoubtedly, in the next 10 to 20 years, fintech will be the core driving force for the development of the global digital economy, and innovation in financial technology has entered the era of neutral trusted intelligence.

The following ten trends of technology development through 2025 are derived from the above two assertions. According to the fields to which these technologies are related, the ten trends are classified as "trust", "intelligence" and "neutral". To be specific, finance, as an industry with extremely high requirements on security, will develop a full-stack trust capability from chips to operating systems, from the data layer to the application layer and end sides, and from hardware to software. On the other hand, the deepening application of artificial intelligence in financial services will drive future financial institutions to be neutral and to provide users with hyper-personalized financial services.

• Technology trends in the trust field

• Trend 1: Full-stack trust

• Trend 2: Preemptive security

• Trend 3: Interoperable blockchain

• Technology trends in the intelligence field

• Trend 4: Secure collaborative intelligence

• Trend 5: Time-series graph computing

• Trend 6: Continuous intelligence

• Trend 7: Factor auto-detection machine learning

• Trend 8: Knowledge graph and multi-modal learning

• Technology trends in the neutral field

• Trend 9: Hyper-personalization

• Trend 10: Openness and transparency

This article focuses on fintech, and its key conclusions are predictions of technology trends derived from future financial scenarios. Expert interviews and empirical studies were the primary methodologies. Ant Group offers this report to promote the dialogue on fintech with the world through an all-round and in-depth analysis of the growing trends of global fintech.

Gartner's Hype Cycle for Emerging Technologies, 2020 is reproduced in Figure 2 below.

Figure 1: The Top 10 Fintech Trends for 2021

Source: Ant Group 2020

Trend 1: Full-stack trust

Overview

• Finance is the core of an economy, while trust is the cornerstone of finance. Trust consists of both the continuous offering and the high reliability of services. Therefore, the financial industry must ensure that it is trustworthy, from a technology perspective, for both clients and their funds.

• Financial systems must be designed with architectural trust from day one. Theoretically, this design principle applies to the whole system stack, which includes the processing chips, the operating system kernel, the data layer, the application layer, and the endpoint. From a software process perspective, it has to be applied from system boot to runtime, and from production to operation. A financial system built on this full-stack trust methodology can therefore survive extreme environmental challenges.

Descriptions

The rapid development and iteration of technology poses higher requirements on current security systems. For financial services, where IT system security is extremely important, the demand for zero-trust architecture is becoming stronger as security awareness increases. Following the "never trust, always verify" principle of zero-trust architecture, each layer of the software and hardware architecture needs to be implemented with security protection and upgraded continuously as the online environment evolves. Moreover, the system has to be prepared for any layer to be broken through without data theft occurring: it should be able to locate the intrusion and switch itself into a defensive mode preemptively.
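To make the "never trust, always verify" principle concrete, the minimal sketch below re-checks identity, device posture and per-resource entitlement on every request rather than trusting network location. The request fields, policy table and resource names are hypothetical placeholders, not a description of any specific system.

```python
# Minimal sketch of "never trust, always verify": every request is checked
# against identity, device posture and per-resource policy on every call,
# regardless of where it originates. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_attested: bool   # e.g. result of a hardware root-of-trust check
    mfa_passed: bool
    resource: str
    action: str

# Per-resource policy: which principals may perform which actions.
POLICY = {
    ("payments-ledger", "read"): {"risk-analyst", "auditor"},
    ("payments-ledger", "write"): {"settlement-service"},
}

def authorize(req: Request) -> bool:
    """Re-verify identity, device and entitlement for every single request."""
    if not (req.device_attested and req.mfa_passed):
        return False                     # never trust network location alone
    allowed = POLICY.get((req.resource, req.action), set())
    return req.user in allowed           # least privilege, deny by default

print(authorize(Request("auditor", True, True, "payments-ledger", "read")))   # True
print(authorize(Request("auditor", True, True, "payments-ledger", "write")))  # False
```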

Figure 2: Hype Cycle for Emerging Technologies, 2020

Gartner: Hype Cycle for Emerging Technologies, Published 24 July 2020, G00450415

1. In the infrastructure layer, trust in both software and hardware will be an essential need

Traditional financial service institutions (FSIs) face a crucial challenge in building a trusted IT environment at extremely high cost, due to the talent scarcity in hardware security and trusted computing. In the future, underlying IT infrastructure hardware will tend to leverage security chips as the root of trust for integrated secure infrastructure capabilities. Upon the security chips, trusted BootLoaders, operating systems, middleware and other trusted components can be built, and in this way a full-stack trusted system can be established for virtual machines, containers, the business layer, and user authentication. The immutable nature of hardware gives hardware-based cryptography stronger reliability, and encrypted computing and other tools that ensure software security will equip organizations with differentiated security capabilities. Therefore, to a certain extent, future competition in security capacity is also competition in the ability to integrate software and hardware.

2. At the system layer, identity-based zero-trust architecture will be the mainstream technology

The network boundary of business systems has been greatly blurred, and not all access will be controlled by physically deployed gateways. Zero-trust architecture based on identity and access management replaces the network as the new boundary of infrastructure and builds a new security defense architecture. With native capabilities of the cloud, such as API-ization, companies can perform unified authentication and authorization on identity access, and grant different access to different people in a dynamic environment to achieve fine-grained management, enabling correct, safe and convenient access to the correct resources for anyone, at any time and anywhere.

3. At the application layer, full-lifecycle data encryption is becoming the fundamental requirement

Data protection through cryptography has been a traditional approach, and its application has expanded from data encryption to all areas of the entire lifecycle of the system. At present, for simple models and algorithms, cryptography-based ciphertext computing tends to be stable and mature; it is bridging the performance gap with plaintext computing while also reaching its bottleneck, and general-purpose security through cryptography alone is becoming increasingly difficult. The future development of ciphertext computing will be based on customization and optimization for specific needs, with the help of cryptography, machine learning, data mining, and the integration of software and hardware.

4. Key Technologies: Integration of Trusted Software and Hardware, Full-stack Encryption, Zero-Trust Architecture, Security Container, Cryptography, Machine Learning and Data Mining, etc.

Trend 2: Preemptive security

Overview

The security environment for companies around the globe has become increasingly severe in recent years, with security incidents constantly emerging. Traditional security strategies are relatively passive in their defense architecture, with a strong dependency on descriptions of known risks, and thus cannot effectively and promptly defend against new types of attacks. For financial scenarios with extremely high sensitivity to security, such a static, passive defense mode lacks agility and must be upgraded to a dynamic, preemptive mode through continuous risk inspection, anomaly detection, and risk assessment.

Descriptions

Traditional risk management platforms are generally built upon a real-time intelligent decision-making system that fully activates the potential of the hardware to compute as many characteristics and rules as possible within a limited response time, realizing a "perfect" balance between disturbance and capital loss. On the opposite side, however, underground industries are already able to probe the prevention systems of various financial business scenarios through trial and error at very low cost, launching large-scale attacks once vulnerabilities are exposed.

1. Preemptive and dynamic adversarial game theory

Turning the tables on attack and defense means revolutionizing the theoretical basis of the technical system, from the theory of risk decision making to the game theory of dynamic adversarial systems, with a "self-proving" ability enabled.

In the early stages, it is crucial to detect underground entities and to analyze and track abnormal operations before deploying defenses against potential malicious moves. Technology breakthroughs in large-scale anomaly detection, augmented analytics, group tracking and dissemination, and dynamic decision-making are thus needed. Meanwhile, daily security prevention should no longer be confined to distinguishing between routine and malice; it is also necessary to accurately identify, and completely reveal and interpret, the intention behind a behavior. At the foundation, highly tangible e-KYC, e-KYB (electronic Know Your Business) and e-KYA (electronic Know Your Attention) are to be enabled. With knowledge graphs deepening industrial knowledge and building risk expertise, the core capabilities of financial security will thereby be established.
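The continuous risk inspection and anomaly detection described above can be pictured with a small sketch: a detector is fitted on routine transaction features and flags departures from them in each new batch. The feature set, the figures and the use of scikit-learn's IsolationForest are illustrative assumptions, not a description of any production platform.

```python
# Rough sketch of continuous anomaly detection over transaction features, in
# the spirit of the "dynamic preemptive" mode described above. The feature set
# is hypothetical; scikit-learn is assumed to be available.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: amount (log scale), hour of day, transfers in the last 10 minutes.
normal = np.column_stack([rng.normal(4, 1, 5000),
                          rng.integers(8, 22, 5000),
                          rng.poisson(1, 5000)])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

def inspect(batch: np.ndarray) -> np.ndarray:
    """Return indices of transactions flagged as anomalous (-1) in this batch."""
    return np.where(detector.predict(batch) == -1)[0]

suspicious = np.array([[9.5, 3, 40]])   # huge amount, 3 a.m., burst of transfers
print(inspect(suspicious))              # expected: [0], flagged for review
```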

Figure 3: Continuous Detection

Source: Ant Group 2020

Current industry-leading preemptive defense technology for financial security can reduce the capital loss rate to 0.64 per 10,000,000¹, materially enhancing the capability of financial security.

¹ Data from Ant Group business practices

2. Real-life activation is key

In the above context, the security industry has reached a consensus that real-life activation is key. Security compliance is an important assurance; nevertheless, the level of security is ultimately decided by the efficacy in confronting actual security threats. Domestic large-scale security drills in key industries have already shed light on the necessity of real-world exercises of attack and defense. Facing a complex security situation, it is real-world activation and testing that examine potential security risks across the complex network system and substantially boost security technologies and services. Overall, with strong demand for effectiveness in real-world activities, security technologies can truly flourish.

3. Key Technologies: Chaos Engineering, Formal Verification, Attack and Defense Drills, Augmented Analytics, and Knowledge Graph, etc.

Trend 3: Interoperable blockchain

Overview

The development of blockchain technology gives rise to a wide variety of blockchain types. As technical standards are not fully unified, different blockchains can differ in data formats or open interfaces, setting up obstacles for chain-based interaction between companies that deploy different blockchains. For the financial industry, whose core value comes from asset circulation among industries, entities and partners through financial services, interoperability across chains becomes especially significant. To tap the true value of blockchain by implementing cross-entity interconnection, forming a more open ecosystem, and facilitating more convenient value exchange, interoperability technology will be a necessity.

Descriptions

Blockchain has been enjoying explosive growth over the last few years, with applications gradually moving from proof-of-concept towards commercial adoption in industries covering healthcare, government, finance, education, transportation, telecommunication, media and entertainment, and more. Current exploration is mainly focused on applications in certain niche markets, facilitating business collaboration within a "local area network". With the large-scale implementation of commercial applications, existing local digital asset circulation in various scenarios can no longer meet the future development needs of the digital economy. Breaking the restriction on interoperability between different blockchains to enable deep interconnection across industries and markets becomes the key to overcoming this bottleneck.

1. Technical upgrade of interoperability driven by booming demand

Based on the current state of industry development and the accumulation of existing technology capabilities, blockchain has stepped into a phase of rapid growth in interoperability demand. Mature industry applications have moved from interaction within a single chain to collaboration among different chains, promoting larger-scale digital asset value circulation and encouraging progress towards the goal of all-chain interconnection. With regard to standards, interoperability technology has developed a multi-level and multi-module standard architecture prototype, and some application-ready sub-modules have established projects in relevant standards organizations. The unification of the interoperability technology architecture and further upgrades of its core techniques are expected to spawn a production-level platform that can support the real demand of all-chain interconnection.

2. Typical scenarios

Examples of the many scenarios that adopt interoperability technology in cross-domain collaboration:

• Supply chain finance

Supply chain finance involves asset and capital circulation among multiple phases and participants, including core enterprises, suppliers, dealers, logistics providers and financial institutions. When they are on different blockchain platforms, interoperability technology can act as an enabler connecting cross-entity businesses.

• Cross-border trade

Complex import and export processes, and the distribution of businesses across different platforms, amplify the difficulty of tracking assets. The capability of interoperability to bridge business modules on different platforms and improve transparency between them becomes a critical demand.

• Rural finance

The development of rural finance relies on the data circulation of agricultural operations. Taking the traceability of agricultural products as an example, full-process tracking of agricultural products from provenance to storage and retailers demands cross-process visibility. Interoperability enables data traceability by linking business processes on different chains, helping to improve rural financial services.

3. Key Technologies: Privacy Security Computing, High-Performance Transaction, Relay Technology, Interoperability Transaction, Interoperability Governance, Naming and Addressing, Heterogeneous Protocol, etc.

Trend 4: Secure collaborative intelligence

Overview

Open finance makes it possible for finance companies to collaborate within the industry or across industries. Multi-dimensional data sharing with privacy protection is a core prerequisite of open finance. Secure collaborative intelligence provides the key capability that integrates multi-party information on the basis of privacy protection, in order to make the best use of data for open finance.

Descriptions

1. Secure collaborative intelligence empowers multi-party credit to achieve inclusive finance

Secure collaborative intelligence aims to make the best use of data through the fusion of multi-party information and learning based on privacy protection.

Secure collaborative intelligence helps credit businesses:

• Offer extremely quick loans online, in a few minutes².

• Achieve multi-party joint modeling and enhance model performance in order to improve credit decision-making while effectively protecting data privacy (a minimal sketch of the idea follows below).
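The multi-party joint modeling bullet above can be illustrated with a toy additive-secret-sharing round, in which several lenders learn a joint total without revealing their individual inputs. The party names and figures are invented, and a real deployment would rely on a hardened MPC framework rather than this sketch.

```python
# Toy additive secret sharing: three lenders jointly compute a total exposure
# without any single party revealing its own figure. Values and party names
# are hypothetical; production systems would use a vetted MPC framework.
import random

PRIME = 2**61 - 1  # arithmetic is done modulo a large prime

def share(value: int, n_parties: int) -> list:
    """Split `value` into n random shares that sum to it modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list) -> int:
    return sum(shares) % PRIME

exposures = {"bank_a": 1_200, "bank_b": 3_400, "bank_c": 900}   # private inputs
all_shares = {p: share(v, 3) for p, v in exposures.items()}

# Each party i sums the i-th share it received from every other party...
partial_sums = [sum(all_shares[p][i] for p in exposures) % PRIME for i in range(3)]
# ...and only these partial sums are exchanged and combined.
print(reconstruct(partial_sums))   # 5500, with no individual exposure revealed
```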

Figure 4: Classic Interoperability Technology Diagram

Source: Ant Group (2020)
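As a rough, self-contained illustration of the relay pattern shown in Figure 4, the sketch below locks an asset record on one in-memory "ledger" and lets a relay mirror it onto another only after verifying the lock commitment. It is a sketch of the control flow only; the classes are stand-ins for real chains and proof systems.

```python
# Toy cross-chain relay: an asset record is locked on a source ledger, and the
# target ledger accepts it only after the relay verifies the lock against a
# (grossly simplified) header it trusts. Everything is an in-memory stand-in.
import hashlib, json

def digest(record: dict) -> str:
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class Ledger:
    def __init__(self, name):
        self.name, self.records, self.locked = name, [], {}
    def lock(self, record: dict) -> str:
        h = digest(record)
        self.locked[h] = record          # asset can no longer move on this chain
        return h                         # commitment handed to the relay
    def header(self):
        return set(self.locked)          # simplified "block header": known locks

class Relay:
    """Carries commitments between chains and checks them against headers."""
    def transfer(self, src: Ledger, dst: Ledger, record: dict, commitment: str):
        if commitment not in src.header() or digest(record) != commitment:
            raise ValueError("lock proof failed verification")
        dst.records.append(record)       # mirror the asset on the target chain

supply_chain, trade_finance = Ledger("supply"), Ledger("trade")
receivable = {"id": "INV-001", "amount": 250_000, "owner": "supplier-42"}
proof = supply_chain.lock(receivable)
Relay().transfer(supply_chain, trade_finance, receivable, proof)
print(trade_finance.records)
```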

² Data from Ant Group business practices

Figure 5: Joint risk management helps enterprises improve the accuracy of the system and reduce monthly capital loss

Source: Ant Group (2020)
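Joint risk management of the kind shown in Figure 5 often publishes only aggregate statistics across institutions. The sketch below shows the differential-privacy idea, discussed under the related technologies that follow, of adding calibrated Laplace noise before a count is shared; the epsilon and sensitivity values are illustrative only.

```python
# Minimal sketch of differential privacy for shared aggregates: calibrated
# noise is added so that the published figure barely changes whether or not
# any single customer is in the data. Parameter values are illustrative.
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Laplace mechanism: noise scale = sensitivity / epsilon."""
    noise = np.random.default_rng().laplace(scale=sensitivity / epsilon)
    return true_count + noise

fraud_cases_this_month = 137                                  # private raw statistic
print(round(dp_count(fraud_cases_this_month, epsilon=0.5)))   # noisy, shareable count
```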

Related Technologies:

• Multi-party computation and trusted execution environments, both types of confidential computing technology, play an important role in secure collaborative intelligence. Combining the two can further improve the performance of multi-party joint models.

• Differential privacy is another confidential computing technology used in secure collaborative intelligence. Multi-party computation and trusted execution environments ensure the security of the computing process; differential privacy ensures the security of the computing result, meaning the original data cannot be derived from the computed results.

Secure collaborative intelligence technology will help to increase the credit approval rate and credit limits for rural finance and SMEs.

2. Secure collaborative intelligence helps financial institutions establish a joint risk management network

Secure collaborative intelligence facilitates risk management by:

• Moving audits online to avoid human error, reduce labor cost and improve efficiency.

• Building standardized databases, models and rules of risk characteristics for different risk types.

• Achieving hyper-personalized risk management through joint decision-making, modeling and defense.

Secure collaborative intelligence assists ecosystem partners in joining forces to establish a risk management network and to build anti-fraud risk management models. While transaction volume increases and capital loss is reduced, the accuracy of risk identification is substantially improved.

3. Key Technologies: Multi-party Computing, Trusted Execution Environment, Differential Privacy, Federated Learning, etc.

Trend 5: Time-series graph computing

Overview

• Financial risk decision-making is an escalating process with continuous adversarial moves. A growing number of risks reflect the group nature and concealment of attacks. Since it is hardly possible to make accurate decisions on the basis of a single event or behavior, more types of relationships, such as transaction, capital, and location, are needed to form an overall insight, identify deep-seated risks and build a monitoring system. Graph computing is the optimal solution to this complex financial scenario.

• The construction of graph computing leverages cross-industry, multi-dimensional heterogeneous data in place of single financial behavioral data, and uses time-series graph data instead of single time-slice graph data. Upon such an upgrade, artificial intelligence is applied to make more comprehensive decisions.

Figure 6: Components and practical results of time-series graph computing³

Source: Ant Group (2020)

Descriptions

1. Time-series graph computing enables financial risk management with richer and more precise decision-making

Time-series graph computing incorporates a graph computing engine and graph storage. The intelligent risk management of mega financial assets is one scenario that has already seen great practical improvement.

Time-series graph computing helps⁴:

• Introduce multi-dimensional and multi-time-series data into decisions, with trillion-grade data in a single scenario.

• Realize millisecond-level real-time graph construction and query under million-level concurrency.

• Achieve second-level group association decision-making analysis.

• Realize an intelligent upgrade from single-point modeling to real-time networked modeling.

• Realize graph data inference and deep learning training over 10 billion nodes and a trillion edges.

• Transform data construction from time-slice-based to time-series-based, to predict behaviors within a specific past timeframe. The data collected includes, but is not limited to, multi-dimensional heterogeneous data beyond financial transaction data.

Related Technologies

• Graph database, graph computing, graph learning framework, dynamic time-series of graph, and construction and reasoning of knowledge graph.

• Dynamic time-series of graph is empowered by dynamic time-series graph mining, dynamic graph learning, time-series analysis, time-series modeling, etc.

For investment risk prediction, financial institutions can comprehensively judge impact on the basis of historical transaction data over a period of time and cross-industry, multi-dimensional heterogeneous data. At the same time, time-series intelligence can also be used to identify transaction risks, help reduce fraud, and decrease personal and corporate financial losses. In corporate credit, it can optimize customer rating models and reduce credit management risks.

³,⁴ Data from Ant Group business practices
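A minimal sketch of the time-series graph idea: edges carry timestamps so that risk queries can be restricted to a recent window rather than a single static snapshot. The accounts, amounts and the use of networkx are assumptions made for illustration.

```python
# Rough sketch of a time-series transaction graph: edges carry timestamps, so
# risk queries can be limited to a recent window instead of one static slice.
# networkx is assumed; accounts and amounts are made up.
import networkx as nx

g = nx.MultiDiGraph()
transfers = [
    ("acct_1", "acct_2", {"amount": 900, "t": 100}),
    ("acct_2", "acct_3", {"amount": 880, "t": 160}),
    ("acct_3", "acct_1", {"amount": 860, "t": 220}),   # funds loop back quickly
]
g.add_edges_from(transfers)

def recent_subgraph(graph, now, window):
    """Keep only edges observed within the last `window` time units."""
    keep = [(u, v, k) for u, v, k, d in graph.edges(keys=True, data=True)
            if now - d["t"] <= window]
    return graph.edge_subgraph(keep)

recent = recent_subgraph(g, now=240, window=150)
# A cycle appearing inside a short window is a classic group-fraud signal.
print(list(nx.simple_cycles(nx.DiGraph(recent))))
```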

Time-series graph computing also helps financial experts codify their experience as rules. With its robust capacity for intelligent risk management, it is expected to be widely adopted.

2. Enterprise knowledge graph helps decrease financial risks caused by information asymmetry

An intricate relationship graph is formed by the capital chains between businesses, or between a business and individuals. Financial institutions face crucial challenges in addressing complex customer relationships, identifying deep-seated risks and building monitoring systems. An enterprise knowledge graph can help decrease financial risks caused by information asymmetry. Empowered by NLP, machine learning, dynamic continuous analysis, causal inference, relevance inference and similar techniques, unstructured data can be structured and relevant data connected through related dimensions. Such dimensions are based on analysis of publicly available information, including but not limited to company shareholders and historical shareholders, investment status, debts and claims, court announcements, and business competition. Ultimately, the enterprise knowledge graph is formed to understand and analyze the significant risks currently faced by a company and thus to implement risk warnings.

An enterprise knowledge graph can also be used to assist in predicting a company's business condition and lowering the nonperforming loan ratio.

3. Key Technologies: Graph Database, Graph Computing, Graph Learning Framework, Dynamic Time-series of Graph, Knowledge Graph

Trend 6: Continuous intelligence

Overview

• Asset and liability management is one of the most complicated parts of the financial industry because of the volume of funds and the diversity of business. Its daily asset management relies heavily on human experience. Liquidity risk is commonly managed by reserving a large amount of funds; however, this may lead to idle funds and may also increase operating cost. It is therefore difficult for financial institutions to balance liquidity risk and fund utilization at the same time.

• Continuous intelligence with digital analytics and AI establishes an intelligent asset and liability management model that implements continuous, automatic model iteration and evolution. It supports financial institutions in achieving effective asset and liability management, improving the decision-making process, reducing liquidity risk and increasing the fund utilization rate by using liquidity forecasts, financial trading strategies and trading robots.

Descriptions

1. Intelligent asset and liability management supports financial institutions in predicting liquidity accurately

Intelligent asset and liability management will help:

• Automate the asset and liability management process by performing accurate cash flow forecasts (a minimal forecasting sketch follows below).

• Predict the growth of business assets and identify key growth opportunities to support highly effective operation for enterprises.

• Improve portfolio investment strategies.

Related Technologies:

• Time-series prediction based on macro, meso and micro finance performance, and prediction of user behavior based on digital analytics.

• Financial price prediction and market judgement based on digital analytics.

• Factor auto-detection based on genetic programming, model interpretability and return attribution.

• Multi-objective optimization strategy.
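As a deliberately simple stand-in for the time-series prediction listed above, the sketch below projects the next day's net outflow from recent history and adds a buffer for forecast error; the figures are hypothetical and production systems would use far richer models than this moving average.

```python
# Rough sketch of the liquidity-forecast idea: project tomorrow's net cash
# outflow from recent history and size the reserve with a safety buffer, so
# that funds are neither idle nor short. Figures are hypothetical.
import numpy as np

daily_net_outflow = np.array([120, 135, 110, 150, 142, 128, 160], dtype=float)  # in millions

def forecast_reserve(history: np.ndarray, window: int = 5, buffer_sigmas: float = 2.0) -> float:
    recent = history[-window:]
    expected = recent.mean()                 # point forecast of tomorrow's outflow
    cushion = buffer_sigmas * recent.std()   # liquidity buffer for forecast error
    return expected + cushion

print(f"reserve to hold tomorrow: {forecast_reserve(daily_net_outflow):.1f}M")
```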
Financial institutions deal with hundreds of billions in funds⁵ on a daily basis. By using a set of continuous intelligence technologies in asset and liability management, they can handle changes in liquidity more accurately, identify liquidity risk in advance, realize unmanned management of the net debit cap, and release workload for large-scale position managers. In a word, continuous intelligence substantially improves the efficiency of asset and liability management for financial institutions with full automation.

⁵ Data from Ant Group business practices

2. Intelligent trading robots replace manpower in interbank financial transactions to substantially improve trading efficiency

Intelligent trading robots can implement automatic inquiry, intelligent interaction, trading willingness identification, transaction matching, transaction verification and full automation of the transaction execution process.

Related Technologies:

• Serial chat and transaction matching based on NLP technology.

• Timely transaction statistics, transaction rule extraction and real-time market analysis based on digital analytics technology.

• Flexible expansion and distributed training capabilities based on the financial cloud.

Intelligent trading robots significantly improve trading efficiency in the interbank financial market. At present, they can be applied in REPO and spot trade scenarios. They save transaction matching time, improve transaction efficiency, expand transaction volume and reduce transaction risk, thus further improving the effectiveness of interbank financial transactions.

3. Key Technologies: Natural Language Processing, Machine Learning, Genetic Programming, etc.

Trend 7: Factor auto-detection machine learning

Overview

Factor analysis in traditional machine learning relies on expert experience and models. With factor auto-detection, machine learning is able to detect factors automatically in advance, realizing self-adaptation and self-learning without manual intervention and marking a disruptive breakthrough for AI. Machine learning with factor auto-detection will be applied in large-scale financial scenarios within five years, reducing subjective analysis and strengthening the impact of objective facts on financial risk control, thereby greatly raising the level of intelligent decision-making.

Descriptions

1. Machine learning with factor auto-detection will penetrate various financial decision-making scenarios

Massive financial records of transactions, customers, billing and remittances provide machine learning with an ideal data context, because the effectiveness of machine learning is highly dependent on training datasets. Factor auto-detection can simultaneously explore the value of external life-related data and massive financial records, and can provide more value than feature engineering based on human experience. Factors are of great importance in the financial field: factor analysis is the core capability in investment robo-advisory, intelligent insurance consulting, credit scoring, intelligent risk management, RPA and more. Especially in investment robo-advisory and intelligent insurance consulting, factor auto-detection helps to find the leading indicators that influence the price of financial products, greatly improving the capability of financial decision-making.

In traditional machine learning, influencing factors are selected manually and thus guide decision-making in a subjective way, shaped by human thinking, which is adverse to risk management and business innovation. The good news is that machine learning with factor auto-detection will penetrate various financial decision-making scenarios within the next five years. By analyzing multi-dimensional historical data and adopting a genetic algorithm, it automatically and continuously generates new valid factors and eliminates invalid ones, ensuring the factors are diverse and explainable. In such cases, subjective analysis is reduced and the impact of objective facts on financial risk control is strengthened.
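A toy version of the genetic search described above: candidate factors are generated by randomly combining raw columns, scored by correlation with the target, and the strongest survives. The synthetic data and tiny operator set are assumptions for illustration only, not the production approach.

```python
# Toy sketch of factor auto-detection: randomly combine raw columns into
# candidate factors, score each by correlation with the prediction target and
# keep the strongest, mimicking one generation of a genetic search.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
raw = {"income": rng.lognormal(10, 0.4, n),
       "debt": rng.lognormal(9, 0.6, n),
       "txn_count": rng.poisson(30, n).astype(float)}
# Hidden ground truth: risk rises with the debt-to-income ratio.
target = raw["debt"] / raw["income"] + rng.normal(0, 0.05, n)

ops = {"ratio": lambda a, b: a / (b + 1e-9),
       "diff": lambda a, b: a - b,
       "product": lambda a, b: a * b}

def random_factor():
    name_a, name_b = rng.choice(list(raw), 2, replace=False)
    op = rng.choice(list(ops))
    return f"{op}({name_a},{name_b})", ops[op](raw[name_a], raw[name_b])

def score(values):
    return abs(np.corrcoef(values, target)[0, 1])

population = [random_factor() for _ in range(50)]                 # one "generation"
best_name, best_values = max(population, key=lambda f: score(f[1]))
print(best_name, round(score(best_values), 3))   # ratio(debt,income) tends to win
```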

2. Key Technologies: Machine Learning, Factor Auto-detection, Augmented Analytics

Trend 8: Knowledge graph and multi-modal learning

Overview

Traditional financial business with complex processing flows requires major human involvement. Knowledge graph, multi-modal machine learning, secure collaborative intelligence and other key technologies can be applied to offer intelligent financial services. Taking intelligent underwriting and intelligent claims in insurance as examples, these key technologies can elevate the level of automation in insurance services, reducing human involvement and optimizing customer experience.

Descriptions

1. Knowledge graph levels up efficiency for intelligent finance

Taking intelligent claims as an example, a knowledge graph makes it possible to:

• Extract heterogeneous information such as text, images, audio and video from evidence to identify potential fraud.

• Integrate the above structured information into the knowledge graph, accumulate rules on the graph platform and use a rule inference engine to obtain advice on business decisions (a minimal rule-inference sketch follows below).

Intelligent claims assist in saving labor cost, speeding up the claims process, reducing false compensation and enhancing customer experience.

Related Technologies:

• Knowledge graph, including knowledge construction, knowledge extraction, knowledge fusion, knowledge assessment and knowledge inference, etc.

• Secure collaborative intelligence, helping realize the principle of minimizing the use of data, that is, obtaining only relevant, necessary, and encrypted feature information for model training. Secure collaborative intelligence technology therefore not only protects personal privacy, but also ensures that the system's feature-processing secrets will not leak, preventing potential fraud.

2. Key Technologies: Natural Language Processing, Knowledge Graph, Image Understanding, Multimedia Content Understanding, Secure Collaborative Intelligence, Robotic Process Automation
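The rule-inference step mentioned above can be sketched with a few triples and one hand-written rule that flags a claim when the same policyholder and repair shop co-occur on other claims; the entities and the rule are hypothetical.

```python
# Minimal sketch of rule inference over a small claims knowledge graph: facts
# extracted from documents become triples, and a hand-written rule is replayed
# over them to surface advice for the claims handler.
facts = {
    ("claim_881", "filed_by", "policyholder_7"),
    ("claim_881", "repair_shop", "shop_12"),
    ("claim_55", "repair_shop", "shop_12"),
    ("claim_55", "filed_by", "policyholder_7"),
}

def related(subject, predicate):
    return {o for s, p, o in facts if s == subject and p == predicate}

def rule_repeat_pairing(claim):
    """Flag a claim when its policyholder and repair shop co-occur on other claims."""
    holder = related(claim, "filed_by")
    shop = related(claim, "repair_shop")
    others = {s for s, p, o in facts
              if p == "repair_shop" and o in shop and s != claim
              and related(s, "filed_by") & holder}
    return f"review {claim}: repeated pairing with {shop}" if others else None

print(rule_repeat_pairing("claim_881"))
```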

Trend 9: Hyper-personalization

Overview

As Internet technology drives financial services from offline to online, users care about customer experience and personalized services more than ever. In the future, financial services must be integrated with business scenarios to provide real-time services. With the widespread application of digital analytics and artificial intelligence technologies in the financial industry, financial institutions can provide users with hyper-personalized financial services such as robo-advisory in investment, intelligent insurance consulting, intelligent customer service, intelligent marketing and so on. In the next five years, digital analytics and artificial intelligence technologies will be applied in more financial scenarios, and hyper-personalized financial services will go mainstream.

Descriptions

1. Changing user expectations drive the popularization of digital analytics and artificial intelligence technologies in the financial industry

As financial services move from offline to online, their service target, service mode and service focus all change significantly.

• The service target gradually moves from large enterprises and high-net-worth individuals to SMEs and the general public.

• The service mode progressively changes from B2C to C2B, and from manual processing to online touchless service, further developing into a personalized, intelligent, real-time service mode.

• The service focus shifts from organizations to users. More and more financial institutions focus on users' specific experience and care about individual user needs.

By using digital analytics and artificial intelligence technologies, financial institutions can obtain insights from consumer behavior data in order to offer hyper-personalized financial services, meeting the diverse needs of existing and new customers.

2. User intelligence leads to hyper-personalized financial services in all business scenarios

User intelligence combines digital analytics and artificial intelligence technologies to accurately match users' needs with financial services through search, recommendation, service content understanding and the like (a minimal matching sketch follows below). In the next five years, user intelligence will help hyper-personalized service be applied to all business scenarios, giving everyone financial services that meet his or her unique needs. Typical scenarios include:

• Robo-advisory in investment
Through AI algorithms and related products, it can provide personalized decision-making advice to users based on their investment goals and risk preferences in order to improve the risk-adjusted return of investment.

• Intelligent insurance consultant
By using digital analytics combined with intelligent recommendation technology, an insurance consultant can offer personalized product selection, payment modes, coverage analysis and gap assessment services.

• Intelligent customer service
Chatbots, virtual assistants and the like provide conversational interfaces between financial institutions and users to establish authentic communication based on natural language understanding and generation technologies. Customers will receive high-standard, tailored services from well-trained experts.

• Intelligent marketing
Through machine learning algorithms, financial institutions can achieve personalized and precise marketing and further optimize marketing effectiveness by analyzing transactions, consumption, social activities, credit, etc.

3. Key Technologies: User Behavior Analysis Algorithms, NLP, Machine Learning and Human-Machine Interactive Technologies, Automatic Speech Recognition (ASR) and Text To Speech (TTS).
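A minimal sketch of the matching step behind such services: a user profile vector is compared with a small product catalogue by cosine similarity and the best matches are returned first. The features and products are invented placeholders, not a description of any live recommender.

```python
# Rough sketch of hyper-personalized matching: score products against a user
# profile with cosine similarity over a few behavioural features.
import numpy as np

# Feature order: risk appetite, savings horizon (scaled), liquidity need.
user_profile = np.array([0.2, 0.9, 0.4])

catalogue = {
    "money_market_fund": np.array([0.1, 0.2, 0.9]),
    "balanced_fund": np.array([0.4, 0.7, 0.4]),
    "equity_fund": np.array([0.9, 0.9, 0.1]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

ranked = sorted(catalogue, key=lambda name: cosine(user_profile, catalogue[name]), reverse=True)
print(ranked)   # most suitable products first for this particular user
```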

Trend 10: Openness and transparency

Overview

• How to enhance users' trust becomes a top priority when developing advanced technologies.

• As digitalization evolves, digital ethics and privacy draw growing attention from individuals, companies and governments. Users' increasing awareness of personal information protection calls for clearer decision-making processes and more accessible corporate policies. On the regulators' part, related laws and regulations should be introduced to secure personal information.

• In the vision of inclusive finance, financial services should be equally available to all, and fintech companies should adhere to the neutrality principle in providing open and unified services.

• Technology empowers financial services and is meant to benefit the majority of the world.

Descriptions

1. The financial industry requires financial digital ethics

There are fintechs that unethically exploit customer privacy for profit in the following ways:

• Forcing authorization and claiming undue permissions while customers are unaware of financial regulations;

• Encouraging frequent transactions and excessive financial leverage, exploiting customers' data and human weaknesses;

• Performing service discrimination and information counterfeiting under the privilege of digital analytics.

To put an end to this, it is fintech companies' duty to develop a series of open and transparent codes of digital ethics and to update privacy policies and specifications in a timely manner.

Requirements for fintechs include, but are not limited to:

• Establish digital ethics guidelines and infuse them into the company's values, technological visions and strategies;

• Disclose privacy policies, clarifying how the company collects, stores, protects, uses and discloses user information, allowing users to better understand and control data sharing and usage; meanwhile, keep the privacy policy brief, specific, readable and accessible for users;

• Stick to the "openness and transparency" principle: clarify the policy-making process and disclose policy specifications on the home page of their applications, improving users' experience in the process of sharing data and enabling users to opt out without effort when necessary.

2. Openness and transparency becomes a consensus in the construction of financial digital ethics

Financial digital ethics should be deeply rooted in fintech companies to protect user privacy, maintain user trust and create value for users themselves, rather than to generate revenue for companies. It should be every fintech company's responsibility and commitment to provide financial services on the basis of digital ethics.

15 Appendix: Emerging FinTech Self-Assessment Checklist

Self Assessment | Top 10 FinTech Trends | Monitoring (1 pt) | Planning (2 pts) | Pilot (3 pts) | Deployment (4 pts)
Trust | Trend 1: Full-stack trust | ( ) | ( ) | ( ) | ( )
Trust | Trend 2: Preemptive security | ( ) | ( ) | ( ) | ( )
Trust | Trend 3: Interoperable blockchain | ( ) | ( ) | ( ) | ( )
Intelligence | Trend 4: Secure collaborative intelligence | ( ) | ( ) | ( ) | ( )
Intelligence | Trend 5: Time-series graph computing | ( ) | ( ) | ( ) | ( )
Intelligence | Trend 6: Continuous intelligence | ( ) | ( ) | ( ) | ( )
Intelligence | Trend 7: Factor auto-detection machine learning | ( ) | ( ) | ( ) | ( )
Intelligence | Trend 8: Knowledge graph and multi-modal learning | ( ) | ( ) | ( ) | ( )
Neutral | Trend 9: Hyper-personalization | ( ) | ( ) | ( ) | ( )
Neutral | Trend 10: Openness and transparency | ( ) | ( ) | ( ) | ( )

Total Score (10-40)

Note: the above self-assessment checklist and scoring are intended as a reference only; a score does not equate to technology capability. Scores above 30 are ideal.
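For readers who want to tally the checklist programmatically, the small helper below applies the 1-4 point scale and the 10-40 range described in the note; the sample answers are made up.

```python
# Small helper for the checklist above: each trend scores 1-4 points depending
# on the stage selected, and the total falls in the 10-40 range.
STAGE_POINTS = {"monitoring": 1, "planning": 2, "pilot": 3, "deployment": 4}

def total_score(answers: dict) -> int:
    """`answers` maps each of the ten trends to one of the four stages."""
    assert len(answers) == 10, "score all ten trends"
    return sum(STAGE_POINTS[stage] for stage in answers.values())

sample = {f"trend_{i}": stage for i, stage in enumerate(
    ["pilot", "planning", "monitoring", "deployment", "pilot",
     "planning", "monitoring", "pilot", "deployment", "planning"], start=1)}
score = total_score(sample)
print(score, "ideal" if score > 30 else "room to grow")
```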

Source: Ant Group

Research from Gartner: Hype Cycle for Emerging Technologies, 2020

Our 2020 Hype Cycle highlights emerging technologies that will significantly affect business, society and people over the next five to 10 years. It includes technologies that enable a composable enterprise, aspire to regain society's trust in technology and alter the state of your brain.

Analysis

What You Need to Know

As a technology innovation leader, CTO or CIO, you must stay up to date with emerging technologies to determine their impact on your industry and the opportunities they present for your organization. This year brings exciting opportunities for you to explore in your search for technology-enabled business transformation. If you're an early adopter, you can use this Hype Cycle as a starting point to:

• Understand the technologies you need to watch during the five- to 10-year planning horizon.

• Explore potential opportunities.

• Plan to exploit these technologies as they become commercially viable.

Technology innovation has become the key to competitive differentiation and is the catalyst for transforming many industries. Breakthrough technologies are continually appearing, challenging even the most innovative to keep up. Your focus on digital business transformation means you must cut through the hype surrounding these technologies. The innovation profiles (IPs) highlighted in this research provide guidance on the business impact of emerging technologies and recommendations for how to use them to drive competitive differentiation.

This year, the emerging technologies on our Hype Cycle fall into five clear trends:

• Composite architectures

• Algorithmic trust

• Beyond silicon

• Formative artificial intelligence (AI)

• Digital me

The Hype Cycle

The Hype Cycle for Emerging Technologies is unique among Gartner Hype Cycles because it distils insights from more than 1,700 technologies that Gartner profiles into a succinct set of "must know" emerging technologies and trends. The technologies on this Hype Cycle are selected for their transformational or high benefits, and the breadth of their impact across business and society. Because of its focus on emerging technologies, this Hype Cycle features only those trends on the first half of the cycle. It tends to introduce technologies that haven't featured in previous iterations. Limited space means that we have had to retire most of the technologies that we highlighted in the 2019 version of this research. The retired technologies remain important and are included in other Hype Cycle research (see the Off the Hype Cycle section).

Trends in Emerging Technologies

This iteration of the Hype Cycle highlights five distinct trends that create highly adaptive solutions, explore the future of AI and rebuild trust in technology and society. You should track these five emerging technology trends.

Composite architectures. Rapid business change and decentralization are driving the need for organizational agility and custom user experiences. The composable enterprise is designed to respond to rapidly changing business needs with packaged business capabilities built on a flexible data fabric. A composite architecture is implemented with solutions composed of packaged business capabilities. Built-in intelligence is decentralized and extends outward to edge devices and the end user.

To make your organization more agile, examine the following critical technologies:

• Composable enterprise

• Packaged business capabilities

• Data fabric

• Private 5G

• Embedded AI

• Low-cost single-board computers at the edge

Algorithmic trust. In recent years, organizations have exposed personal data, used biased AI models and flooded the internet with fake news and videos, to name just a few issues. In response, a new trust architecture is evolving, shifting from trusting organizations to trusting algorithms. Algorithmic trust models are replacing trust models based on responsible authorities. This is to ensure the privacy and security of data, the provenance of assets, and the identity of individuals and things. Algorithmic trust helps to ensure that organizations will not be exposed to the risks and costs of losing the trust of their customers, employees and partners.

To start rebuilding trust with your customers, employees and partners, examine the following technologies:

• Secure access service edge (SASE)

• Differential privacy

• Authenticated provenance

• Bring your own identity (BYOI)

• Responsible AI

• Explainable AI

BYOI holds an unusual position in the Hype Cycle as its maturity is early mainstream, but it hasn't yet reached the bottom of the Hype Cycle. The reason for this is that several implementation models exist for BYOI. These range from long-established social identities (for example, Facebook and LinkedIn) to less mature implementations (such as identities) to emerging decentralized (blockchain) implementation models. The positioning/maturity is a trade-off that reflects the various implementation models.

Beyond silicon. Gordon Moore famously predicted that the number of transistors in a dense integrated circuit would double approximately every two years. For more than 40 years, Moore's Law has guided the IT industry. As technology approaches the physical limits of silicon, new advanced materials are creating breakthrough opportunities to make technologies faster and smaller.

Explore the following technologies:

• DNA computing and storage

• Biodegradable sensors

• Carbon-based transistors

Formative AI. This refers to a set of emerging AI and related technologies that can dynamically change to respond to situational variances, hence the term "formative." Some of these technologies enable application developers and user experience designers to create solutions using AI-enabled tools. Other technologies enable the development of AI models that can evolve dynamically to adapt over time. The most advanced of these technologies can generate novel models to solve specific problems.

To explore the boundaries of AI, analyze the following technologies:

• AI-augmented design

• AI-augmented development

• Ontologies and graphs

• Small data

• Composite AI

• Adaptive machine learning (ML)

• Self-supervised learning

• Generative AI

• Generative adversarial networks

Digital me. Technology is becoming more integrated with people to create opportunities for digital representations of ourselves. The COVID-19 pandemic has spawned health passports and social distancing technologies designed to keep people safe. Digital twins of humans provide models of individuals that can represent people in both the physical and digital space. The way we interact with the digital world is also changing, moving beyond the use of screens and keyboards to a combination of interaction modalities (e.g., voice, vision and gesture), and even directly altering our brains.

Track the following technologies:

• Social distancing technologies (also known as contact-tracing apps)

• Health passport

• Digital twin of the person

• Citizen twin

• Multiexperience

• Two-way BMI (brain machine interface)

Two digital-me technologies are moving particularly quickly through the Hype Cycle: health passports and social distancing technologies. Both of these technologies are related to the COVID-19 pandemic, which partly explains their accelerated progression.

Technologies rarely enter the Hype Cycle at the point at which social distancing technologies have entered it, but this technology has received extraordinary attention in the media, mainly because of privacy concerns. Health passport is also unusual because technologies rarely enter the Hype Cycle with a market penetration of 5% to 20% of the target audience. However, this technology is required for access to public spaces and transport in China (the Health Code app) and India (the Aarogya Setu digital service), and hundreds of millions of people in those countries are using it. We expect that both technologies will reach the final stage of the Hype Cycle in less than two years. (See Figure 1.)

The Priority Matrix

The Priority Matrix maps the benefit rating for each innovation against the amount of time each innovation requires to achieve mainstream adoption. (See Figure 2.) The benefit rating provides an indicator of the potential of the innovation, but the rating may not apply to all industries and organizations. So identify which of the innovations offer significant potential benefits to your organization based on your own use cases. Then use this information to guide investment decisions. Examine innovations that offer more significant, near-term benefits because they can offer both strategic and tactical benefits. Explore innovations with longer-term benefits if they offer strategic value. We recommend tracking technologies that are important to your organization by creating a technology radar. Alternatively, use our Hype Cycle tool to create a customized Hype Cycle for your organization.

Emerging technologies are disruptive by nature, but the competitive advantage they provide isn't yet well known or proven. Most will take more than five years, and some more than 10 years, to reach the Plateau of Productivity.

Figure 1. Hype Cycle for Emerging Technologies, 2020

But some technologies on the Hype Cycle will mature in the near term, so you must understand the opportunities these present, particularly those with the potential for transformational or high impact.

Most technologies have multiple use cases. To determine whether a technology will have a significant impact on your industry and organization, explore each of the use cases. Prioritize those with the greatest potential benefit and prepare to launch a proof-of-concept project to demonstrate the feasibility of a technology for a specific use case. When a technology can perform in a particular use case with reasonable quality, examine the other obstacles to deployment to determine the appropriate deployment planning horizon. Obstacles may include cost, regulation, social acceptance and nonfunctional requirements.

Off the Hype Cycle

The Hype Cycle for Emerging Technologies is not a typical Gartner Hype Cycle. It draws from an extremely broad spectrum of topics and we intend it to be dynamic. It features many technologies for only a year or two, after which it doesn't track them to make room for other important technologies. Most technologies that we remove from this Hype Cycle continue to be tracked on other Hype Cycles. Refer to Gartner's broader collection of Hype Cycles for items of ongoing interest.

We've removed many of the technologies that appeared in the 2019 version of this Hype Cycle, including:

• 3D sensing cameras — Two Hype Cycles still track this technology, including "Hype Cycle for Sensing Technologies and Applications, 2020."

Figure 2. Priority Matrix for Emerging Technologies, 2020

• 5G — Six Hype Cycles still track this technology, including "Hype Cycle for Unified Communications and Collaboration, 2020."

• AI cloud services — Three Hype Cycles still track this technology, including "Hype Cycle for Artificial Intelligence, 2020."

• AR cloud — Two Hype Cycles still track this technology, including "Hype Cycle for Edge Computing, 2020."

• Augmented intelligence — Three Hype Cycles still track this technology, including "Hype Cycle for Artificial Intelligence, 2020."

• Autonomous driving Level 4 — "Hype Cycle for Automotive Electronics, 2020" still tracks this technology.

• Autonomous driving Level 5 — "Hype Cycle for Automotive Electronics, 2020" still tracks this technology.

• Biochips — "Hype Cycle for Sensing Technologies and Applications, 2020" still tracks this technology.

• Decentralized web — "Hype Cycle for Blockchain Technologies, 2020" still tracks this technology.

• DigitalOps — "Hype Cycle for Enterprise Architecture, 2020" still tracks this technology.
There are many goods or content being tracked on Workplace, 2020.” methods used to authenticate the the blockchain in the first place. provenance of assets, depending For now, that certification relies • Knowledge graphs — Two on their nature and whether they on manual audits or human trust. Hype Cycles still track are digital or physical goods. That is certainly not scalable. For this technology, including example, human fact checkers “Hype Cycle for the Digital Position and Adoption Speed cannot keep up with detecting Workplace, 2020.” Justification: Counterfeit physical fake content, despite the growth goods and fake digital content in independent English language • Light cargo delivery drones have become major national fact checks by more than 900% — Two Hype Cycles still track and health security threats at from January to March 2020. See this technology, including worse, and costly problems for market study “Types, “Hype Cycle for Drones and organizations at best. Blockchain Sources, and Claims of COVID-19 Mobile Robots, 2020.” provenance and asset tracking Misinformation.” applications are being adopted to address these issues, but 22 User Advice: CIOs, Enterprise Intelligent automation that Figma, InVision) soon, leading to Architects, Technology Innovation digitally authenticates and verifies major leaps in efficiency, quality, leaders and Application Leaders content provenance is clearly and time to market. At multiple responsible for applications and and urgently needed. The more companies, AI-augmented systems that generate and receive organizations that participate, design is already transforming goods and content within their the more effective the solutions. the customer experience organization should: This is because content rarely through decision support and stays within the confines of personalization in CX products • Adopt technologies that the environment in which it is and site builder platforms like can digitally authenticate produced, so solving this thorny B12 have added AI to assemble and verify provenance to fake goods and content problem content and best practices for prevent fake or altered goods very much depends on the growth your business type in under a and content from being of the network, including the minute. distributed and consumed. authenticators and the verifiers that adopt the solutions. User Advice: Application leaders • Work with your data scientists should: and IT teams to establish and Benefit Rating: High track provenance of goods • Monitor developments and content your organization Market Penetration: Less than in AI-augmented design, produces and consumes, 1% of target audience specifically at Adobe, followed using supporting technologies by InVision. like Blockchain, AI and factory Maturity: Embryonic component quality assurance • Prepare digital product testing. Sample Vendors: IBM; ThinkIQ teams for the emergence of AI-augmented design, • Work with peers and industry AI-Augmented Design first through design-to-code groups to form networks Analysis By: Brent Stewart technology, followed by bots of active participants who that produce high-fidelity can collectively and more Definition: AI-augmented design screen designs and written effectively combat fake goods is the use of artificial intelligence content. and content. Start by fixing (AI), machine learning and natural the problem in your own language processing technologies • Transition the role of humans organization. 
to automatically generate, and in the design process from evolve via machine learning, user production-level creators to Business Impact: The good news flows, screen designs, content strategic curators. is that there are some emerging and presentation layer code for solutions on the market, for digital products. Business Impact: The potential example that rely on spectral business impact of AI-augmented imaging, AI models, and factory Position and Adoption design is tremendous. Imagine quality assurance testing to Speed Justification: AI- the following scenario for creating authenticate the provenance of a augmented design is in its an online store: goods or particles the technology infancy. Conceptually, the can decipher and understand. design community sees the • First, you tell the AI that you These types of technologies have bold, fascinating — and want an online store; the AI been applied to authenticating even frightening — future AI- automatically generates the diamonds, wheat supplies, augmented design will enable. standard structural elements chemotherapy drugs and Gartner expects to see AI at work of an online store from the electronic components, and are in the digital product design homepage to product detail getting promising results. platform market (Adobe Xd, templates to the shopping cart.

DNA Computing and Storage
Analysis By: Nick Heudecker; Rajesh Kandaswamy

Definition: DNA computing and storage uses DNA and biochemistry to perform computation or storage instead of silicon or quantum architectures. Digital data is represented as synthetic DNA strands, loosely translating as memory and disk in traditional architectures, while enzymes provide the processing capabilities. DNA computing relies on code stored in DNA strands, and computation is done through chemical reactions.

Position and Adoption Speed Justification: DNA computing and storage makes its debut on the Hype Cycle after two triggering events. The first is the development of an end-to-end "DNA drive" prototype created by Microsoft Research and the University of Washington, demonstrating the viability of an all-in-one DNA storage solution. The second is the successful storage of English-language Wikipedia as DNA by CATALOG, a startup in the DNA computing space.

For DNA computing and storage to progress to the Plateau of Productivity, significant technical barriers must be overcome. First, the creation of synthetic DNA, the medium used to store digital data as DNA, must become much more efficient and cost-effective. Today, synthetic DNA is almost entirely used in life sciences research, and there hasn't been a compelling need to lower costs and increase output. The development of commercial computing using DNA may change that, triggering a next-generation breakthrough for DNA synthesis similar to what we witnessed with DNA sequencing. Another barrier is access speeds and throughput rates: current access and throughput for DNA technologies are orders of magnitude lower than for traditional technologies. Lastly, effective and efficient processing methods must be found, which are currently the subject of multiple research organizations.

DNA computing advances today are rudimentary logic, and they need to mature to handle complex logic and math. Further development is also needed to make the computing architectures reprogrammable, one of the key conveniences of silicon-based computers.

If these barriers can be overcome, DNA computing should advance steadily through the Hype Cycle curve, reaching the Plateau of Productivity in roughly 10 years. As these technologies progress, it is likely that Gartner will split this topic into two innovation profiles: one for DNA-based storage and another for DNA-based compute. At this early stage, we are combining these topics for clarity.

User Advice: We recommend the following:

• Begin evaluating the viability of DNA-based storage by gauging when storage prices fall to three to four orders of magnitude the cost of tape archival, and when write speeds reach the megabit-per-second range.

• Exploit early opportunities to use DNA data storage for product-centric uses, such as embedding DNA tags into products to ensure authenticity and provenance.

• Monitor technology innovation in the DNA storage and computing space, around cost and performance breakthroughs and venture capital investment, for an appropriate time to begin proof-of-concept testing.

Business Impact: As DNA computing and storage matures, the impact will be transformational for data storage, processing parallelism and computing efficiency. While unsuitable for every computing task, DNA computing potentially lends itself to graph and machine learning inference, as well as unstructured search and digital signal processing.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Microsoft; Twist Bioscience
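As a purely illustrative aside, the idea of "digital data represented as synthetic DNA strands" can be sketched in a few lines of Python: every pair of bits maps to one of the four bases. This shows only the encoding concept; real systems add error correction, avoid problematic base runs and, above all, face the synthesis cost and read-back throughput barriers described above.

```python
BASES = "ACGT"  # 2 bits per base: 00->A, 01->C, 10->G, 11->T

def encode_to_dna(data: bytes) -> str:
    """Map each byte to four bases, most significant bit pair first."""
    return "".join(
        BASES[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def decode_from_dna(strand: str) -> bytes:
    """Reverse the mapping: four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

strand = encode_to_dna(b"Hi")
print(strand)                    # CAGACGGC
print(decode_from_dna(strand))   # b'Hi'
```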
Low-Cost Single-Board Computers at the Edge
Analysis By: Tony Harvey

Definition: Low-cost single-board servers are small, low-cost, general-purpose systems that perform functions such as filtering data (for example, anomaly detection) or AI inferencing (for example, image recognition) at the edge. Based on a system-on-chip (SoC) solution, single-board servers are designed with the minimum capability to perform the tasks required. I/O interfaces will vary, but at a minimum will include a wired or wireless network. The operating environment will be based around a micro OS, VMs and containers to enable rapid delivery of updates.

Position and Adoption Speed Justification: Single-board servers at the edge are a relatively new development for processing data and delivering AI inferencing at the edge. Initially, the market was driven by the introduction of very low-cost single-board general-purpose computers such as the Raspberry Pi. Now the market has expanded with open-source microcontroller-based systems like Arduino, and AI inferencing systems such as the Texas Instruments BeagleBone AI and the NVIDIA Jetson Nano.

Unlike larger edge servers, which are generally repackaged x86 servers, single-board edge servers are fixed-configuration single-board systems based on ARM CPUs. Opportunities exist for other CPU architectures such as x86 and RISC-V, but the dominance of ARM in the SoC space will make it difficult for other chip architectures to meet the power and performance requirements to succeed in this space.

While the profusion of vendors and the low cost of the hardware make prototyping and development very easy, the lack of standards and, as importantly, the lack of security features will make enterprise usage less likely. As the opportunities grow, so will demand for a more secure solution and a more standardized software environment for developers.

User Advice: Evaluate the use of single-board edge servers for edge projects where a large number of low-cost devices will be required to provide data processing, image recognition, voice recognition or AI inferencing capabilities (a minimal sketch of edge-side data filtering follows this profile). Expect this market to evolve rapidly over the next few years, with improved performance and new capabilities being rolled out at a rapid cadence. Choose single-board edge servers that can be rolled out rapidly, without skilled staff on-site, and that can easily be managed and updated in the field. Security should be built into the system, and potential vendors should be evaluated for security across all areas, including physical, data storage, communications, management and updates. Integration with existing Internet of Things (IoT) and artificial intelligence (AI) frameworks should also be considered when selecting a single-board edge server.

Business Impact: Single-board edge servers can help enterprises realize the potential of the large pool of data that is generated at the edge. The ability to use this data has significant potential to generate cost savings, for example by allowing real-time image processing to recognize faulty or damaged items on manufacturing lines. It also helps develop new areas of business enabled by real-time data processing at the edge. Enterprises that do not adopt single-board edge servers may find themselves left behind, as enterprises that successfully integrate these systems into their digital transformation strategies will lower their costs and deliver new services to market faster.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Coral; NVIDIA; Raspberry Pi Foundation (Raspberry Pi); Texas Instruments (TI)
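The data-filtering role mentioned in the definition can be made concrete with a minimal sketch: the device keeps a short rolling window of sensor readings and forwards only readings that deviate sharply from recent behavior, so bandwidth and upstream processing are reserved for unusual events. The class name and thresholds below are hypothetical, not a vendor API.

```python
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyFilter:
    """Keep a short rolling window of readings on the device and flag only
    readings that deviate strongly from recent behavior (illustrative sketch)."""
    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def process(self, reading: float) -> bool:
        """Return True if the reading should be forwarded upstream as an anomaly."""
        is_anomaly = False
        if len(self.history) >= 10:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.threshold:
                is_anomaly = True
        self.history.append(reading)
        return is_anomaly

f = EdgeAnomalyFilter()
readings = [20.0 + 0.1 * (i % 5) for i in range(40)] + [35.0]
flags = [f.process(r) for r in readings]
print(flags[-1])  # True: the spike is forwarded, the routine readings stay local
```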
Self-Supervised Learning
Analysis By: Pieter den Hamer; Erick Brethenoux

Definition: Self-supervised learning is an approach to machine learning in which labeled data is created from the data itself, without having to rely on external (human) supervisors to provide labels or feedback. It is inspired by the way humans learn through observation, gradually building up general knowledge or "common sense" about concepts and their relations in the real world.

Position and Adoption Speed Justification: Self-supervised learning aims to overcome one of the biggest drawbacks of supervised learning: the need for access to typically large amounts of labeled data. This is not only a practical problem in many organizations with limited relevant data or where manual labeling is prohibitively expensive. It is also a more fundamental problem in current AI, in which the learning of even simple tasks requires a huge amount of data, time and energy. In self-supervised learning, labels can be generated from relatively limited data. In essence, this is done by masking elements in the available data (e.g., a part of an image, a sensor reading in a time series, a frame in a video or a word in a sentence) and then training a model to "predict" the missing element (a minimal numerical sketch follows this profile). Thus the model learns, for example, how one part relates to another, how one situation (captured through video and/or other sensors) typically precedes or follows another, and which words often go together. In other words, the model increasingly represents the concepts and their spatial, temporal or other relations in a particular domain. This model can then be used as a foundation to further fine-tune the model, using "transfer learning," for example, for one or more specific tasks with practical relevancy.

User Advice: Self-supervised learning is an important candidate enabler for a next main phase in AI, overcoming the limitations of, and going beyond, the current dominance of supervised learning. It has only recently emerged from academia and is currently practiced by only a limited number of innovative AI companies. In practice, it is worth considering when available data volumes are limited or when the benefits of the ML solution do not outweigh the costs of manually labeling or annotating data. However, it currently depends very much on the creativity of highly experienced ML experts to carefully design a self-supervised learning task, based on masking of the available data, that allows a model to build up knowledge and representations that are meaningful to the business problem at hand. Tool support is still virtually absent, making implementation a knowledge-intensive and low-level coding exercise.

Business Impact: The potential impact and benefits of self-supervised learning are very large, as it will extend the applicability of machine learning to organizations that do not have large datasets available. It may also shorten training time and improve the robustness and accuracy of models. Its relevancy is most prominent in computer vision, natural language processing, IoT analytics/continuous intelligence, robotics and other AI applications that rely on data that is typically unlabeled. For AI companies, self-supervised learning has the potential to bring AI closer to the way humans learn: mainly from observation, building up general knowledge about the world through abstractions and then using this knowledge as a foundation for new learning tasks, thus incrementally building up ever more knowledge.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Craftworks; Facebook; Google; Microsoft
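A minimal numerical sketch of the masking idea described above, assuming only NumPy: unlabeled sensor readings are turned into a supervised task by hiding the middle sample of each window and asking a model to predict it. Real self-supervised learning applies the same trick to images, video and text with deep networks; the linear "fill in the blank" model here is only to make the mechanics concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "sensor" data: a noisy smooth signal, with no human labels anywhere.
signal = np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.standard_normal(500)

# Self-supervision: mask the middle sample of each window and use it as the label.
window = 5
X, y = [], []
for i in range(len(signal) - window):
    chunk = signal[i:i + window].copy()
    target = chunk[window // 2]        # the "masked" element becomes the label
    chunk[window // 2] = 0.0           # hide it from the model
    X.append(chunk)
    y.append(target)
X, y = np.array(X), np.array(y)

# A linear "fill in the blank" model, fit by least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w
print("mean absolute error:", float(np.mean(np.abs(pred - y))))
```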
Health Passport
Analysis By: Brian Burke; Arnold Gao

Definition: Health passports are a pandemic/epidemic response technology implemented as mobile apps that indicate the level of infection risk of the holder. They are used to gain access to buildings, supermarkets, restaurants, public spaces and transportation.

Position and Adoption Speed Justification: In February 2020, Alipay and WeChat worked with the government to launch a national "Health Code" in China, which is required to gain access to many public and private spaces and services. Health Code is widely used as a screening tool to minimize the risk of COVID-19 transmission. It provides the user with a color QR code based on their designated health status: red means confirmed infected with COVID-19, yellow means the holder should be in quarantine, and green means free to travel. Health Code checks are very common, making it difficult to move without a green code. In India, travelers must be marked "safe" on the Aarogya Setu app for travel by rail and air. In May 2020, the UAE launched the ALHOSN UAE app, which also provides a unique QR color code (red = infected, yellow = quarantine, green = OK, gray = not tested) but was not yet being used to gain access to places at the time of writing.

Because these technologies have been introduced only in recent months, there will be rapid evolution in this space. There are many obstacles, the most important being the restriction of personal freedoms and privacy. Social acceptance of the technology will depend very much on the culture of the society where it is introduced. One large issue is providing some type of health passport to people without a mobile phone; alternative methods are required.

User Advice: Trust and transparency will be key to the acceptance of any health passport. Clarity into the algorithms used to generate the color code is of paramount importance (an illustrative sketch of such a decision rule follows this profile). Simplicity is also a key attribute, as people will not want to use different health passports to gain access to different locations and services. Many people will view health passports as assurance that they are at low risk of infection from the people around them in public places; many others will view these apps as limiting their freedom, or even as discriminatory.

Ideally, health passports would be managed by public health services, but there is a risk that organizations may take things into their own hands if authorities don't act quickly. Employers, schools and airports (among others) all have a keen interest in providing a low-risk environment for their employees and visitors, and in fact may be legally liable if they fail to do so. These organizations may implement their own health passports, creating a challenge for people who must then manage many different ones.

Governments are also eager to reopen travel for foreign visitors, and a health passport may help achieve that if there is trust between the issuing authority and the destination country/region. The standards for maintaining a trusted code will evolve over time and may include periodic viral tests, antibody tests, quarantine history and travel history. Interoperability will be a key requirement for using a health passport for travel, but standards for interoperability are nonexistent today.

Business Impact: Health passports would help enable all locations to begin accepting visitors with a lower level of risk, opening doors to end lockdowns and helping to restore confidence and rebuild the global economy. This will be a great benefit to businesses and all organizations, as a lack of confidence will prevent or minimize the use of services. Health passports will also have a positive effect on reducing new infections overall: because simply moving around becomes impossible without a green code, people in quarantine will be effectively forced to stay home, eliminating the need for the separate quarantine enforcement technologies in use in some countries.

Harmonized health passport systems will also be highly desirable to enable international travel for both business and pleasure.

Benefit Rating: Transformational

Market Penetration: 5% to 20% of target audience

Maturity: Emerging

Sample Vendors: Alipay; Bizagi; Circle Pass Enterprises; Folio; Vottun; WeChat
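The color-code logic described above can be made concrete with a toy decision rule. Everything here is hypothetical: the inputs, the validity period and the gray fallback only loosely mirror the schemes named in the profile, and actual passports apply rules defined by the issuing authority. The profile's point is that whatever those rules are, they need to be transparent.

```python
from datetime import date, timedelta

def health_color(confirmed_infected, in_quarantine, last_negative_test=None, test_valid_days=7):
    """Toy decision rule mirroring the red/yellow/green/gray scheme described above.
    Inputs and thresholds are hypothetical; real systems combine test, quarantine
    and travel history under rules set by the issuing authority."""
    if confirmed_infected:
        return "red"
    if in_quarantine:
        return "yellow"
    if last_negative_test and (date.today() - last_negative_test) <= timedelta(days=test_valid_days):
        return "green"
    return "gray"  # not tested / status unknown

print(health_color(False, False, last_negative_test=date.today() - timedelta(days=2)))  # green
print(health_color(False, True))                                                        # yellow
```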
Bidirectional Brain-Machine Interface
Analysis By: Sylvain Fabre; Annette Jump

Definition: Bidirectional brain-machine interfaces (BMIs) are brain-altering neural interfaces that enable two-way communication between a human brain and a computer or machine interface. Bidirectional BMIs allow not only monitoring of the user's EEG (electroencephalogram) and mental states, but also some action to be taken to modify that state based on analytics and insights. Brain state modification occurs via noninvasive electrostimulation through a head-mounted wearable, or via an invasive implant. When connected, these enable the IoB (Internet of Brains).

Position and Adoption Speed Justification: It is still very early days for bidirectional BMI. There are already applications of one-way BMI wearables, where the focus is on monitoring the state of the user or using user intent to operate an external device, but without trying to externally modify the user's mood. Some of these solutions even measure the response and attitude of consumers to products and companies.

In September 2019, Facebook acquired the neural interface startup Ctrl-labs for over $500 million and will work to include the technology as a computer interface and in AR/VR consumer products via Facebook Reality Labs. In 2017, DARPA also awarded a $65 million contract to develop a bidirectional BMI.

To estimate the likely progress of bidirectional BMI, it is worth noting that a related earlier trend, smart wearables, experienced significant hype in 2016 through 2018, fueled particularly by interest in consumer smart wearable devices and software. However, venture capital investment in companies developing smart wearable products and solutions decreased from $2.8 billion in 2018 to $1.6 billion in January 2019 through October 2019. This return to 2017 investment levels highlights the shift in VC investors' evaluation of opportunities and potential markets for some smart wearables. The 2019 decline in VC smart wearable investment underscores the issues linked with smart wearable devices, including high cost, slow consumer adoption, high drop-off rates for some smart wearables, and the complexity of integration between various data systems.

Since bidirectional BMIs are a more advanced and extreme form of wearable (in effect, an implant with bidirectional connectivity), the above trend provides some guidance as to what needs to occur to allow wider adoption of bidirectional BMI. Namely, it will need to become more affordable and find ways to add functionality without added invasiveness.

An early application is from NYX Technologies, currently in beta testing, which aims to use neurotechnology to both monitor and stimulate brain function and improve sleep.

User Advice: Enterprises should be prepared for the future creeping of bidirectional BMI devices into enterprises; BYOD may occur before specific legislation is in place, so business leaders should:

• Ensure customer safety and business security by implementing data anonymity and privacy (beyond GDPR) for brain-wearable data in products.

• Highlight trade-offs when promoting wellness solutions.

• Take responsibility: set up a steering board to monitor products sold to consumers and provided for employees. Preempt potential legal liability by regularly reviewing implanted wearables' features and use cases, and decide what is acceptable in terms of read/write from and to users' brains.

• Establish policies for unauthorized implantables: while they cannot easily be removed, users may be prohibited from some roles, such as operating vehicles or machinery (as BYOD bidirectional BMI implants would pose risks similar to drugs or stimulants).

Business Impact: There are multiple form factors for devices designed to be worn or implanted to sense the human body, such as smartwatches, head-mounted displays (HMDs), ear-worn wearables or hearables, wristbands, smart rings, smart garments, smart contact lenses, exoskeletons, implants and ingestibles. Over the next three to 10 years, they will enable business use cases including authentication, access and payment; immersive analytics and workplace; and control of power suits or exoskeletons.

What is unique about bidirectional BMI is that it is a brain-altering class of wearables/implantables. In addition to the use cases above, bidirectional connectivity for the brain enables, for example, stimulation applied to boost alertness in response to markers of fatigue in a worker's EEG, or relaxing cortical currents applied to the brain of a teacher or nurse showing signs of irritability. This creates very specific ethical and security challenges, because these devices are a direct interface to the human brain.

Bidirectional BMIs are the front line of innovation powering human augmentation. They are designed to exhibit some level of autonomy when connection to the internet is not available or desirable. They are also designed to learn using machine learning (ML), interact with the environment around the wearer, enhance human abilities, and connect humans to the Internet of Things (IoT) and the Internet of Brains (IoB).

As a result, direct read-and-write access to brain activity creates many opportunities for workforce enablement. It also creates new vulnerabilities for individuals and their companies by adding a vector of attack and human-factor issues such as altering users' perception of reality, or even their personality.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: BrainCo; Facebook; Kernel; Neuralink; Neuroelectrics; NeuroMetrix; NYX; Omron Healthcare
Generative Adversarial Networks
Analysis By: Brian Burke

Definition: Generative adversarial networks (GANs) are composed of two neural network models, a generator and a discriminator, that work together to create original simulations of objects such as videos, images, music and text (poetry, journalistic articles, marketing copy) that replicate authentic objects or their pattern, style or essence with varying degrees of quality or realism. GANs can also be used in an inverse design process to generate models of novel drug compounds or new materials with targeted properties.

Position and Adoption Speed Justification: Originally proposed by Ian J. Goodfellow in 2014, this technology is in a nascent state, with most applications coming from research labs. Commercial applications have only just started being explored. The algorithms require a lot of manual tuning to make them perform in the desired manner, and development of the technology is constrained by the extremely limited pool of people with knowledge in this area. As commercial applications become more commonplace, the technology will improve, as the benefits are significant.

GANs can be used for both good and bad purposes. They are commonly used to create images of people who don't exist ("deep fakes"), to create fake political videos, and to compose music and poetry. In 2018, an image produced by a GAN was sold at auction for $432,500. While these "novelty" applications are prominent, research is underway to apply these algorithms to far more valuable challenges, such as generating marketing content and graphic designs, creating simulated environments for training autonomous vehicles and robots, and generating synthetic data to train neural networks and to protect privacy. GANs are also being used in inverse design to create targeted pharmaceutical compounds and materials with specific properties.

User Advice: Technology innovation leaders in high-risk-tolerant organizations should evaluate the potential for leveraging this technology today, and partner with universities to conduct proofs of concept where the potential benefits and drawbacks are significant. They should do their due diligence and consider that, while the core technologies are readily available in the public domain, the technology is brittle, resource-hungry and requires significant (and rare) AI skills. They should also focus on other pressing issues such as explainability: GANs are "black boxes," and there is no way to prove the accuracy of the objects produced other than by subjective methods.

Business Impact: The powerful idea is that deep neural network classifiers can be modified to generate realistic objects of the same type (the sketch after this profile shows the generator-discriminator training loop on a toy problem). GANs have the potential to impact many creative activities, from content creation (art, music, poetry, stories, marketing copy, images and video) to many types of design (architecture, engineering, drug, fashion, graphic, industrial, interior, landscape, lighting, process). GANs might also be used to create simulations where actual data may be difficult to obtain (training data for machine learning), pose a privacy risk (medical images for health data) or be costly to produce (backgrounds for video games). GANs have the potential to augment humans' talents for many creative tasks across many industries. GANs are part of a group of generative AI methods (including variational autoencoders [VAE], recurrent neural networks [RNN] and reinforcement learning [RL]) that are being used in inverse design. In material science, inverse design turns the material discovery process on its head by starting with the properties of the target material and analyzing the chemical space to generate a material with the required properties.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Amazon; Apple; Autodesk; DeepMind; Google; Insilico Medicine; Landing AI; Microsoft; Neuromation; NVIDIA
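A minimal sketch of the generator-discriminator interplay, assuming PyTorch is available: the "authentic" objects are simply samples from a one-dimensional Gaussian, the generator learns to imitate them, and the discriminator learns to tell real from generated. Production GANs for images or music follow the same loop with far larger networks and considerable tuning, which is exactly the brittleness the profile warns about.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator maps noise to fake samples; discriminator scores real vs. fake.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

real_data = lambda n: torch.randn(n, 1) + 3.0   # "authentic" samples ~ N(3, 1)
noise = lambda n: torch.randn(n, 8)

for step in range(2000):
    # 1) Train the discriminator to tell real from generated samples.
    real, fake = real_data(64), G(noise(64)).detach()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    fake = G(noise(64))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(G(noise(1000)).mean().item())  # drifts toward ~3.0 as the generator improves
```

The adversarial loop is the whole trick: neither network is given an explicit description of the target distribution, yet the generator ends up reproducing it.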
Biodegradable Sensors
Analysis By: Michael Shanler

Definition: Biodegradable sensors are thin-film sensors manufactured using nontoxic materials that can go into common waste streams. The primary application is microsensing for food monitoring. Some of these sensors are bioresorbable, meaning they can be ingested. Others are biocompatible, meaning they can be implanted into medical devices or pharmaceutical products before dissolving or harmlessly passing after ingestion.

Position and Adoption Speed Justification: This is a new innovation profile for 2020. Biodegradable sensors are a relatively old concept within academia, dating back to the 1950s, but until recently few research institutions were able to design and manufacture them at the right price points for use in products at scale. Over the last five years, multiple research institutions in Switzerland, the U.S., the U.K., Japan and Korea have pushed biodegradable sensors to the point where they are ready for industry use. Leveraging advanced design and simulation principles, polymer science and green technologies made this advance possible.

Today, biodegradable sensors can be designed to perform a variety of specific functions. They operate as detectors for changes to pH, humidity, oxidization, gasses, glucose, antibodies and chemicals. Others are manufactured as RFID tags, with carbon electrodes printed on paper. Some circuits are printed to be used as repeaters for both active and passive sensor technology. These sensors are often manufactured by embedding chips or sandwiching sensors between thin-film polylactic acid (PLA) or dissolvable silicon, and are produced using corn and potato starch. PLA and related biofilms and green plastics are harmless and biodegrade over time. Compositions comply with U.S. and EU food legislation and label requirements.

The sensors embedded into the material may not be fully biodegradable, but they are designed using nontoxic materials that can exist within the human body at low levels, even when accumulating over time (such as molybdenum, magnesium, zinc, silicon dioxide and nitride). Some use RFID-related technologies. Others are powered by the substrate or the products in which they are embedded. The sensors often can operate for a few weeks before eroding, and they are designed to go to waste in traditional landfills. Most of these sensor formats are smaller than a grain of rice; however, research organizations are actively miniaturizing them even further.

Gartner has observed several prototypes at companies and some initial use cases, but beyond small commercial offerings (such as Proteus Digital Health), the technology has yet to be scaled to the masses. These sensors have a lot of potential to change the way food, retail and medical devices are monitored and used; however, we only envision success at scale when manufacturers hit the right price points and margins. This is a newly commercialized technology offered by a handful of vendors; thus, Gartner places this embryonic technology in the Innovation Trigger phase.

User Advice: Evaluate the advantages of biodegradable sensors and how they may dovetail with smart product or Internet of Things (IoT)-enabled product strategies. Drivers could be product quality, tracing, authenticity or performance. CIOs must also plan to build in the IoT data ingestion and analytics capabilities required to deliver business value from sensors. Specifically:

• CIOs and CTOs in the food and beverage, consumer, and retail industries should evaluate using these sensors for tracking product quality ("use by" dates), locations, unique identifiers and performance (such as pH, oxidation, taste and degradation) of fruits, meats, grains and vegetables. This activity must also include potential impacts on product margins and the cost of goods sold. These sensors can be affixed to the inside of outer packaging (cereal boxes), affixed to product labels (such as biocompatible RFID stickers on premium apples) or even embedded into the products themselves (inside ground beef).

• CIOs and CTOs in the healthcare and life science space should evaluate bioabsorbable and biocompatible iterations for the IoT and sensing potential of both drugs and devices. CIOs and technology leaders must evaluate the sensors while accounting for the downstream device regulatory requirements (for example, 510(k) Class I, II and III submission) and determine what is required to put them into production.

Before investing, life science companies must outline with a clear vision what is required to make these sensors work in their highly regulated manufacturing, supply chain and distribution channels. Teams must determine where new policies, systems and business processes are needed to support serialization, unique identifiers and safety systems. They also must determine early on whether the sensors are considered part of software as a medical device, a companion device and/or digital therapeutics.

Business Impact: These sensors can add data to augment the customer or distribution channel, and have effects on smart supply and logistics, with added capabilities for measuring real-time physical, chemical and biologic functions. They could help streamline the product life cycle and provide data for location, serialization, product quality, tampering and product performance. There will be useful benefits from combining sensor data with informatics and operational systems for R&D, quality, regulatory, manufacturing and supply chain, or other specialized areas (such as clinical, diagnostics and safety). Specifically, these sensors can dovetail with smart products, IoT analytics and sensor-enabled business models. They can also be used to support smart manufacturing, as well as adaptive supply chains and distribution channels.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: c2renew; EPFL; ETH Zurich; Grolltex; imec; Murata; NanoScale Systems; Proteus Digital Health
likely increase the organizations’ increasingly recognize the This effectively delivers the exposure to legal proceedings and value in data such as customer same analytic results whether potentially damaging financial information in CRM databases individual’s data is included in penalties: however they also find the dataset or not. Differential themselves liable for protection privacy systems use probabilistic • Enterprises holding sensitive of this sensitive personal randomization of the data data assets with personally data. Regulations that define elements in the dataset to make identifiable information how enterprises may use this it impossible for malicious should explore the use of information are increasingly actors to reverse engineer differential privacy systems to being defined and implemented those data attributes and tie decrease the likelihood they and the liability for leaking such them to a specific individual. will expose sensitive data that information can be substantial. While not specifically designed can be tied to individuals. In addition, the reputation to prevent reidentification of the business and the trust attacks, differential privacy does • Enterprises should also take associated with the business effectively protect against these other measures to protect can be significantly damaged by attacks. Differential privacy does data assets containing breaches of sensitive information. have a weakness if the algorithm sensitive data that contains This will require businesses to is repeatedly applied to the data. personally identifiable use whatever means available information. to protect datasets containing User Advice: Differential privacy sensitive information. This systems should be employed • Organizations should not exposure is not limited to when datasets have significant assume that differential the datasets in control of the value to be extracted but the privacy systems alone are business as malicious actors information contains sensitive enough to prevent breach of can increasingly combine data individual information or legally sensitive information. sources to reidentify individuals

32 even if the data used by the on their public infrastructure. • CSPs offering Private 5G to business is anonymized. As There is another class of use cases industrial buyers need to work such, businesses should employ however, not focused on mobility with IT service providers that whatever means available to but requiring a high performance have the required industry protect personally identifiable backbone where wiring is complex skills (and knowledge of other information from being exposed. and costly — as in a factory technologies) to provide a deployment. In that case, support value proposition of how 5G Benefit Rating: High of autonomous delivery vehicles supports a specific use case on the shop floor, could apply, but or business KPI to justify Market Penetration: 1% to 5% of as a complementary application. the additional investments target audience Some verticals may adopt 5G required to make existing sooner as a response to the current infrastructure 5G ready. For Maturity: Emerging COVID-19 pandemic, driven by example, in manufacturing cost optimization, resiliency and 5G is seen as a platform that Sample Vendors: 01Booster; automation concerns. combines connectivity with Hazy; LeapYear security and AI capabilities. Volkswagen is reported to plan Private 5G 5G private deployments in 122 Business Impact: While 5G Analysis By: Sylvain Fabre; Joe German factories in 2020. standard are defined by 3GPP, Skorupa other bodies are contributing in An early implementations order to improve applicability to Definition: Private 5G is defined example is: BMW Brilliance connected industrial applications, as a private mobile network Automotive (BBA, BMW’s JV for example 5G-ACIA (Alliance (PMN) based on 3GPP 5G to in China) claims complete 5G for Connected Industries and interconnect people/things in coverage in all of its factories. Automation). 5G PMN can offer an enterprise. CSPs and TSPs enterprises improved security have the potential to offer 5G User Advice: Enterprises looking and independence and enable PMN to various verticals, such to deploy 5G PMN should: efficiency gains in several as Industry 4.0, mining, oil, manufacturing and industrial utility and rail road companies; • Seek out quotations not only processes. IoT service providers, university from CSPs but also other campuses, stadiums/arenas etc. possible providers such as An early implementations Private 5G offerings provide a large equipment vendors and example is: BMW Brilliance separate network from the public smaller specialist Automotive (BBA, BMW’s JV network. It can include voice, in China) claims complete 5G video, messaging, and broadband • Consider SIs and coverage in all of its factories. data and IoT/M2M use cases with consultancies for design, higher performance requirements. deployment and managed Benefit Rating: High services. Position and Adoption Speed Market Penetration: Less than Justification: Most enterprises • Consider licensed and also 1% of target audience use cases that justify a need to unlicensed/shared spectrum deploy a Private Mobile Network options where available (for Maturity: Emerging (PMN) that requires cellular example 5G was approved connectivity for mobility, will for use in CBRS in February Sample Vendors: Athonet; be adequately served with a 4G 2020 such as CBRS; German Ericsson; Huawei; Nokia network. 
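A minimal sketch of the noise-insertion idea, using the classic Laplace mechanism for a counting query (sensitivity 1): the released count barely changes in distribution whether or not any single individual is in the dataset. This is illustrative only; the profile's warning about repeatedly applying the algorithm corresponds to the privacy budget being consumed each time such a query is answered.

```python
import random

def dp_count(records, predicate, epsilon=0.5):
    """Release a count with Laplace noise of scale 1/epsilon (sensitivity 1),
    so adding or removing any single individual barely changes the output
    distribution -- the core differential privacy guarantee."""
    true_count = sum(1 for r in records if predicate(r))
    # Difference of two exponentials with rate epsilon is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

people = [{"age": a} for a in (23, 37, 41, 52, 29, 61, 45)]
print(dp_count(people, lambda r: r["age"] > 40))       # noisy answer near 4
print(dp_count(people[:-1], lambda r: r["age"] > 40))  # one person removed: similar answer
```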
Private 5G
Analysis By: Sylvain Fabre; Joe Skorupa

Definition: Private 5G is defined as a private mobile network (PMN) based on 3GPP 5G to interconnect people and things in an enterprise. CSPs and TSPs have the potential to offer 5G PMN to various verticals, such as Industry 4.0, mining, oil, utility and railroad companies, IoT service providers, university campuses, stadiums/arenas and so on. Private 5G offerings provide a network separate from the public network. It can include voice, video, messaging and broadband data, as well as IoT/M2M use cases with higher performance requirements.

Position and Adoption Speed Justification: Most enterprise use cases that justify deploying a private mobile network requiring cellular connectivity for mobility will be adequately served with a 4G network. Where 5G is justified, a PMN can provide the required functionality earlier than what the local CSPs may have deployed on their public infrastructure. There is another class of use cases, however, not focused on mobility but requiring a high-performance backbone where wiring is complex and costly, as in a factory deployment. In that case, support of autonomous delivery vehicles on the shop floor could apply, but as a complementary application. Some verticals may adopt 5G sooner as a response to the current COVID-19 pandemic, driven by cost optimization, resiliency and automation concerns.

Volkswagen is reported to plan private 5G deployments in 122 German factories in 2020. An early implementation example: BMW Brilliance Automotive (BBA, BMW's JV in China) claims complete 5G coverage in all of its factories.

User Advice: Enterprises looking to deploy 5G PMN should:

• Seek out quotations not only from CSPs but also from other possible providers, such as large equipment vendors and smaller specialists.

• Consider SIs and consultancies for design, deployment and managed services.

• Consider licensed and also unlicensed/shared spectrum options where available (for example, 5G was approved for use in CBRS in February 2020, and the German regulator Bundesnetzagentur (BNetzA) has allocated spectrum in the 3.7GHz to 3.8GHz band for industrial local 5G).

CSPs offering Private 5G to industrial buyers need to work with IT service providers that have the required industry skills (and knowledge of other technologies) to provide a value proposition of how 5G supports a specific use case or business KPI, to justify the additional investments required to make existing infrastructure 5G-ready. For example, in manufacturing, 5G is seen as a platform that combines connectivity with security and AI capabilities.

Business Impact: While 5G standards are defined by 3GPP, other bodies are contributing in order to improve applicability to connected industrial applications, for example 5G-ACIA (Alliance for Connected Industries and Automation). 5G PMN can offer enterprises improved security and independence and enable efficiency gains in several manufacturing and industrial processes.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Athonet; Ericsson; Huawei; Nokia
Small Data
Analysis By: Jim Hare; Pieter den Hamer

Definition: The concept of "small data" refers both to the problem of training AI models when training data is insufficient or sparse, and to the approaches that make small amounts of training data enough. There are a variety of strategies and data augmentation techniques to overcome the problem, such as simulation, synthetic data, transfer learning, federated learning, self-supervised learning, few-shot learning and knowledge graphs.

Position and Adoption Speed Justification: Supervised deep learning, which started the current AI hype, is already fulfilling its promise, but it needs a lot of labeled data. Unlike consumer internet companies, which have data from billions of users to train AI models, collecting massive training sets is often not feasible in most enterprises. Also, most data science teams are not in a position to develop and train complex supervised models from scratch, due to resource limitations. Moreover, reducing the need for big data, and gaining the ability to use small data, results in AI solutions that are more resilient and agile in handling change. For example, the COVID-19 virus caused many production AI models across different industry verticals to lose accuracy, because they were trained on big data that reflected how the world worked before the pandemic hit. Retraining models using the same approach was not feasible, because more recent data of just a few weeks old was too limited to reflect the patterns of the new market circumstances. As a result, data scarcity has emerged as a major challenge, even more so as organizations become dependent on AI to run their businesses, including in times of disruption.

There is a growing number of data science innovations and open-source projects focused on data augmentation and other techniques. Among others, graph techniques have garnered new attention because of their ability to find patterns in small data, or to reduce dimensionality, complementing machine learning. Several new AI startups have created platforms and solutions that operate on small datasets.

User Advice: Data and analytics leaders whose teams are experiencing data scarcity issues in exploring new AI use cases, building hypotheses, or handling production models that have lost their accuracy should consider this approach first:

• Simpler Models — Replace more complex models with simpler, classical ML models such as linear regression, support vector machines, k-nearest neighbors and naïve Bayes that can be trained on small amounts of data. Proper feature engineering and the use of simpler models, or ensembles thereof, should be in the toolbox of any data scientist, especially in the case of small data.

If replacing existing models with simpler models is not feasible, consider these emerging data augmentation and modeling approaches:

• Transfer Learning — Enables AI solutions to learn from a related task where ample data is available and then use this knowledge to help overcome the small data problem. For example, an AI solution learns to find damaged parts from 1,000 pictures collected from a variety of products and data sources. It can then transfer this knowledge to detect damaged parts in a new product using just a few pictures.

• Federated Learning — Enables collaborative ML by sharing local model improvements at a central level, where the central model combines locally trained or retrained models built on small data in a decentralized environment (a minimal sketch follows this profile). For example, when a hospital wants to develop a model for treating a condition but has limited data, it trains the model on its own local data. It then passes the model to the next hospital, which keeps training the model on its own data, and so on, combining the model improvements. This also increases data privacy, as no local data needs to be shared centrally.

• Synthetic Data — Used to generate data that meets very specific needs or conditions not available in existing authentic data. Can be useful when privacy needs limit the availability or usage of the data, or when the data needed to train a model does not exist.

• Self-supervised Learning — A relatively recent ML technique in which the training data is autonomously (or automatically) labeled. The datasets are labeled by finding and exploiting the correlations between different input signals. Production models can continue learning in production, making self-supervised learning well suited for changing environments.

• Few-shot Learning — Classifies new data having seen only a few training examples. This forces the AI to learn to spot the most important patterns, since it only has a small dataset. Useful when training examples are hard to find or the cost of labeling data is high.

• Other approaches include the sharing of scarce data between organizations, together building a larger set, and the use of reinforcement learning, where data is gathered through simulations or experimentation.

Business Impact: Small data techniques enable organizations to manage production models that are more resilient and able to adapt to major world events like the pandemic or future disruptions. These techniques are ideal for AI problems where no big datasets are available. Using smaller amounts of data allows data scientists to use more classical machine learning algorithms that provide good-enough accuracy without the need for big training sets. It can also speed up business exploration and model prototyping for novel solutions, as this approach reduces the time, compute power, energy and costs needed to collect, prepare or label large datasets.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Diveplane; Google (Cloud AI); Landing AI; MyDataModels; OWKIN
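A minimal sketch of the federated learning pattern from the list above, assuming NumPy: each site fits a small model on its own private data, and only the model weights are combined centrally (weighted averaging in the spirit of federated averaging). Real frameworks run many rounds and handle heterogeneous data and secure aggregation; the point here is only that no raw records leave a site.

```python
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0, 0.5])   # the relationship all sites partially observe

def local_fit(n):
    """One 'hospital' fits a model on its own small dataset; only weights leave the site."""
    X = rng.standard_normal((n, 3))
    y = X @ true_w + 0.1 * rng.standard_normal(n)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w, n

# Three sites with small, private datasets.
local_models = [local_fit(n) for n in (15, 20, 12)]

# Central server combines the locally trained models, weighted by sample count,
# without ever seeing any raw patient records.
total = sum(n for _, n in local_models)
global_w = sum(w * n for w, n in local_models) / total
print(global_w)  # close to [2.0, -1.0, 0.5]
```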
Adaptive ML
Analysis By: Pieter den Hamer; Erick Brethenoux

Definition: Adaptive machine learning (ML) is the capability of frequently retraining ML models online, in their runtime environment, rather than only training them offline in their development environment. This capability allows ML applications to adapt more quickly to changing or new real-world circumstances that were not foreseen, or for which data was not available, during development.

Position and Adoption Speed Justification: Adaptive ML gets AI much closer to self-learning, or at least to more frequent learning, compared with most current AI applications, which use static ML models that depend on infrequent redeployment of model updates to improve themselves. Adaptive ML, also known as continuous learning, is technically challenging for several reasons, including the following (a minimal incremental-learning sketch follows this profile):

• User feedback or closed-loop data about the quality of the ML output (e.g., prediction errors) is required to enable reinforcement learning for updating the model parameters while online.

• Less frequent model updates can already be achieved by the current approach of offline retraining, using the full set of available training data, and periodic model update deployments. With adaptive ML, there is no time to fully retrain the model. Instead, the model must be incrementally retrained online, using only new or recent data, which requires incremental learning algorithms that differ from offline learning algorithms, which typically rely on large batches of (historical) data.

• Adaptive ML must be tuned in terms of weighting new data against older data used for earlier online or offline training, and faces other challenges such as preventing overfitting and ensuring proper testing and validation, at least periodically.

• Nontechnical challenges include the ethical, societal, reliability, liability, safety and security concerns that come with self-learning and autonomous systems.

User Advice: Organizations should consider the use of adaptive ML for one or more of the following reasons:

• The ever-increasing complexity, pace and dynamics of the environment, society and business require ML models that frequently adapt to changing circumstances and context events. This is most relevant in real-time application areas like continuous intelligence, streaming analytics, and decision automation and augmentation in a myriad of industries and business areas.

• Adaptive ML is a key enabler of autonomous systems, such as self-driving vehicles or smart robots, that should be able to operate in ever-changing contexts.

• With adaptive ML, models remain accurate longer and suffer less from model drift. Data science teams can improve their productivity by leveraging adaptive ML to reduce the need for conventional model monitoring, retraining and redeployment. This will reduce the time needed for ModelOps/MLOps.

In addition:

• Adaptive ML should be considered by organizations not to replace but to complement current ML. Most adaptive ML applications will start out with a model that was first trained offline. Adaptive ML can be seen as a way to further improve, maintain, contextualize, personalize or fine-tune the quality of ML models once online.

• Adaptive ML can be used to compensate for the limited availability of training data, or "small data," which hinders offline (e.g., supervised or reinforcement) learning during development. Adaptive ML may start out with a minimal viable model that was pretrained offline, with the model then incrementally improved during actual online usage.

• Adaptive ML must be accompanied by model monitoring for accuracy and relevancy, and by proper risk analysis and risk mitigation activities, if only to frequently monitor the quality and reliability of adaptive ML applications. Even with adaptive ML, a periodic offline full retraining of the model may be required, as incremental learning has its limitations.

• Organizations should actively manage the talent, infrastructure and enabling technology specifically required for adaptive ML. For example, adaptive ML is likely to be more demanding in terms of compute power in runtime environments, and it will require the development of knowledge about new (incremental learning) algorithms and tools.

Business Impact: The main impact of adaptive ML is the ability to respond more quickly and effectively to change, enabling more autonomous systems that are responsive to the dynamics of both gradual change and massive disruptions. For example, the COVID-19 pandemic has resulted in significant changes in market circumstances, requiring adaptation of existing ML models to maintain their accuracy. Adaptive ML is most relevant in areas in which context and conditions, or the behavior or preferences of actors, change frequently. Example application areas include customer churn in highly competitive markets, gaming, organized crime fighting and anti-terrorism, fraud detection, cybersecurity, quality monitoring in manufacturing, virtual personal assistants, (semi)autonomous cars and smart robotics.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Cogitai; Guavus; IBM; Microsoft; Tazi
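A minimal sketch of incremental (online) learning, the core mechanism behind adaptive ML as described above, assuming NumPy: instead of a periodic offline retrain on the full history, the model takes small gradient steps on each new batch in its runtime environment, so it tracks concept drift. The class below is hypothetical; production systems add the monitoring, new-versus-old data weighting and periodic full retrains that the profile calls for.

```python
import numpy as np

class OnlineLinearModel:
    """Minimal incremental learner: updated on each new batch at runtime, with a
    learning rate that implicitly weights recent data more heavily than old data."""
    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features)
        self.lr = lr

    def update(self, X, y):
        for xi, yi in zip(X, y):
            error = xi @ self.w - yi
            self.w -= self.lr * error * xi   # incremental gradient step, no full retrain

rng = np.random.default_rng(2)
model = OnlineLinearModel(2)

# Offline pretraining regime: y = 1*x0 + 2*x1
X = rng.standard_normal((200, 2)); model.update(X, X @ np.array([1.0, 2.0]))
# The world shifts (concept drift): y = 3*x0 - 1*x1. Keep updating online.
X = rng.standard_normal((200, 2)); model.update(X, X @ np.array([3.0, -1.0]))
print(model.w)  # has drifted toward the new relationship, roughly [3, -1]
```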
Composite AI
Analysis By: Pieter den Hamer; Erick Brethenoux

Definition: Composite AI refers to the combined application of different AI techniques to improve the efficiency of learning, to increase the level of "common sense" and, ultimately, to solve a much wider range of business problems far more efficiently.

Position and Adoption Speed Justification: Composite AI is currently mostly about combining "connectionist" AI approaches, like deep learning, with "symbolic" and other AI approaches, like rule-based reasoning, graph analysis, agent-based modeling or optimization techniques. Composite AI aims to synergize these approaches, both from a pragmatic engineering perspective (improving the effectiveness of AI) and from a more profound scientific perspective (progressing our knowledge about artificial intelligence). The ideas behind composite AI are not new, but they are only recently truly materializing. The goal is to enable AI solutions that require less data and energy to learn and that embody more "common sense," thus bringing AI closer to human learning and intelligence. In addition, composite AI recognizes that neither deep learning nor graph analytics nor more "classical" AI techniques are silver bullets. Each approach has its strengths and weaknesses; none is able to resolve all possible AI challenges.

User Advice: AI leaders and practitioners should:

• Identify projects in which a fully data-driven, ML-only approach is unviable, inefficient or ill-fitted. For example, this is the case when not enough data is available, when training a deep learning network requires large amounts of data, time and energy, or when the required type of intelligence is very hard to represent in current artificial neural networks.

• Leverage domain knowledge and human expertise to provide context to, and complement, data-driven insights by applying decision management with business rules, knowledge graphs or physical models in conjunction with machine learning models.

• Combine the power of deep learning in data science, image recognition or natural language processing with graph analytics to add higher-level, symbolic and relational intelligence (for example, spatiotemporal, conceptual or common-sense reasoning).

• Extend the skills of data scientists and machine learning experts, or recruit/upskill additional AI experts, to also cover graph analytics, optimization or other techniques required for composite AI. In the case of rules and heuristics, skills for knowledge elicitation and knowledge engineering should also be available.

• Since composite AI is still emerging, be aware that its benefits can only be achieved through the creative artisanship of AI experts who combine the strengths of the underlying AI techniques while avoiding the disadvantages and weaknesses of each.

Business Impact: Composite AI offers two main benefits in the short term. First, it brings the power of AI to a broader group of organizations that do not have access to large amounts of historical or labeled data but do possess significant human expertise; composite AI is one of the strategies for dealing with "small data." Second, it helps to expand the scope and quality of AI applications, in the sense that more types of reasoning challenges and required intelligence can be embedded in composite AI. Other benefits, depending on the techniques applied, include better interpretability and the support of augmented intelligence. There are many possible examples:

• A heuristic or rule-based approach can work together with a deep learning network in AI for predictive maintenance. Rules coming from human engineering experts, or from the application of physical/engineering model analysis, may specify that certain sensor readings are likely to indicate inefficient asset operations; that indication can then be used as a feature to train a neural network to assess and predict asset health (a minimal sketch follows this profile). Typically, such a combination is much more effective than relying only on heuristics or only on a fully data-driven approach.

• In computer vision, (deep) neural networks are used to identify or categorize people or objects in an image. This output can then be used to enrich or generate a graph that represents the image entities and their (spatiotemporal) relationships. This enables answering questions like "which object is in front of another?" and "what is the speed of an object?" Using a connectionist approach only, such seemingly simple questions are extremely hard to answer.

• In supply chain management, a composite AI solution can be composed of multiple agents, with each agent representing an actor in the ecosystem, typically having its own intelligence to monitor local conditions and machine learning to make predictions. Combining these agents into a "swarm" enables the creation of common situation awareness, more global planning optimization, and more dynamic, responsive scheduling.

In the longer term, composite AI has the potential to pave the way for more generic and intelligent AI solutions with a profound impact on business models, although still a far cry from the elusive artificial general intelligence.

Benefit Rating: Transformational

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: ACTICO; Beyond Limits; BlackSwan Technologies; Cognite; Exponential AI; FICO; IBM; Indico; Petuum; ReactiveCore
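A minimal sketch of the first example above (rules plus machine learning for predictive maintenance), assuming NumPy: an expert rule is evaluated over raw sensor readings and fed to a simple logistic-regression learner as an additional feature alongside the raw signals. The data, rule thresholds and failure model are all synthetic and hypothetical; the point is only the composition of a symbolic rule with a data-driven learner.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# Raw sensor readings for a fleet of assets: temperature and vibration.
temp = rng.normal(70, 8, n)
vib = rng.normal(3, 1, n)

# Symbolic part: an expert engineering rule flags likely inefficient operation.
rule_flag = ((temp > 80) & (vib > 4)).astype(float)

# Synthetic ground-truth failures for the sketch, loosely driven by the same physics.
fails = (0.04 * (temp - 70) + 0.5 * (vib - 3) + rng.normal(0, 0.5, n)) > 1.0

# Data-driven part: logistic regression over standardized signals plus the rule feature.
X = np.column_stack([(temp - 70) / 8, (vib - 3) / 1, rule_flag, np.ones(n)])
w = np.zeros(X.shape[1])
for _ in range(2000):                       # plain batch gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - fails) / n

print("learned weights [temp, vibration, expert rule, bias]:", np.round(w, 2))
```

The learned weights combine the raw signals with the expert rule, which is the composition the profile describes: domain knowledge shapes the features, while the model is fit from data.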
Generative AI

Analysis By: Svetlana Sicular; Avivah Litan; Brian Burke

Definition: Generative AI is a variety of ML methods that learn a representation of artifacts from the data, and use it to generate brand-new, completely original, realistic artifacts that preserve a likeness to the training data but do not repeat it. Generative AI can produce novel content (images, video, music, speech, text, even in combination), improve or alter existing content and create new data elements.

Position and Adoption Speed Justification: The hype around generative AI is heating up due to its sensational successes and huge societal concerns. According to Adweek, patent filings for generative AI grew 500% in 2019. Christie's auction house already sells AI-generated artwork. More practical applications, like differential privacy and synthetic data, are increasingly drawing enterprises' attention.

AI methods that directly extract numeric or categorical insights from data are relatively widespread. Generative AI, which creates original artifacts or reconstructed content and data, is the next frontier. So far, it is less ubiquitous and has fewer use cases. The hype around generative AI is growing due to recent notable progress of generative adversarial networks (GANs), invented in 2014, and language-generating models, such as Bidirectional Encoder Representations from Transformers (BERT), introduced in 2018, and Generative Pre-trained Transformer 2 (GPT-2), introduced in 2019. Other quickly progressing generative AI methods include self-supervised learning, variational autoencoders and autoregressive models.

Regrettably, generative AI technologies underpin "deep fakes," content that is dangerous in politics, business and society. Prominent organizations, such as Partnership on AI and DARPA, are pursuing detection of "deep fakes" to counteract fraud, disinformation, instigation of social unrest and other negative impacts of generative AI. In 2020, "deep fakes" are not yet pervasive among the fake content and news spread across the web, but Gartner expects this to rapidly change in the next five years.

User Advice: Data and analytics leaders should evaluate generative AI for the following purposes:

• Creative AI, a large subcategory of generative AI to produce art and work that typically requires imagination, for example, Adobe Sensei for visual arts and OpenAI Jukebox for music.

• Content creation, such as text, images, video and sound. Content creation already penetrates marketing, for example, producing personalized copywriting. Twenty-nine percent of marketing leaders rank generative content creation among the top three, according to the 2019 Gartner Marketing Technology Survey.

• Content improvements, such as rewriting outdated text, background noise cancelation, increasing image resolution, and modifying photos by adjusting, removing or adding artifacts.

• Data creation, often known as synthetic data, to mitigate data scarcity or privacy barriers to insight. Generative techniques create new data instances, so the generated data repeats patterns of the actual data but is completely made up. For example, text generation for chatbots, image generation for quality analysis in manufacturing, or differential privacy. Visma generated for the Norwegian Labour and Welfare Administration the entire population while preserving demographic nuances (see the sketch at the end of this profile).

• Industry applications in retail, healthcare, life sciences, telecommunications, media, education and HCM. For example, in healthcare, generative AI could create medical images that depict the future development of a disease. In consumer goods, it can generate catalogs. In e-commerce, it can help customers to "try on" various makeups and outfits.

• Gartner recommends that software companies that produce generative AI include methods to preclude their software from being used to generate fake content before releasing the software, delivering the antidote immediately in version 1.0.

Organizations must prepare to mitigate the impact of deep fakes, which can cause serious disinformation and reputational risk. There are several methods evolving to do this, including algorithmic detection and tracing content provenance. Full and accurate detection of generated content will remain challenging for years and may not be completely possible. To do so will require elevating critical thinking as a discipline in the organization. Technical, institutional and political interventions combined will be necessary to fight deep fakes. We will see unusual collaborations, even among competitors, to solve the problem of deep fakes and other ethical issues rooted in generative capabilities of AI.

Business Impact: More use cases will surface and proliferate. The field of generative AI will progress rapidly, in both scientific discovery and technology commercialization. Reproducibility of AI results will be challenging in the near term. Other technologies, especially those that provide trust and transparency, could become an important complement to generative AI solutions.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Adobe (Sensei); Bitext; Dessa; Google (DeepMind); Landing AI; LeapYear; OpenAI; Phrasee; Spectrm; Textio
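As a minimal illustration of the synthetic-data use case mentioned above, the sketch below fits a multivariate Gaussian to a small numeric table and samples new records that repeat its statistical patterns without copying any row. This is a toy stand-in for the GANs and autoencoders named in the profile; the column names and figures are invented for the example.

```python
# Toy synthetic-data generator: fit a multivariate normal to numeric columns
# and sample made-up records with similar means and correlations. Real
# generative models (GANs, VAEs) are far more capable; this only illustrates
# "repeats patterns of the actual data, but is completely made up."
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical source data: columns [age, income, account_balance]
real = np.column_stack([
    rng.normal(40, 10, 500),
    rng.normal(52_000, 9_000, 500),
    rng.normal(8_000, 3_000, 500),
])

mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

synthetic = rng.multivariate_normal(mean, cov, size=500)

print("real means     :", np.round(real.mean(axis=0), 1))
print("synthetic means:", np.round(synthetic.mean(axis=0), 1))
print("max correlation gap:",
      np.round(np.abs(np.corrcoef(real, rowvar=False)
                      - np.corrcoef(synthetic, rowvar=False)).max(), 3))
```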
Packaged Business Capabilities

Analysis By: Yefim Natis

Definition: Packaged business capabilities (PBCs) are encapsulated software components that represent a well-defined business capability, recognizable as such by a business user. They inherit some characteristics from both microservices (encapsulation and domain-driven design) and monolithic applications (self-contained and delivering clear and complete business value), but are more business-oriented than the former and more adaptive than the latter. Complete vendor applications may be delivered as assemblies of PBCs.

Position and Adoption Speed Justification: PBCs are a foundational technology resource of the composable enterprise. They act as the building blocks for rapid composition and recomposition of application experiences. And when combined with democratized application composition tools, they empower application innovation by multidisciplinary fusion teams, IT professionals and business technologists. Fully expressed PBCs encapsulate a business entity (e.g., a bank account) and are exclusive owners of the entity's data. They provide the complete set of APIs and event channels to facilitate the entity's entire life cycle (e.g., open, close, deposit, withdrawal, lookup and all other applicable bank account actions). Basic PBCs may represent a single atomic business function (e.g., bank account deposit), therefore having limited autonomy. Data and analytics PBCs deliver reference information and researched insights, respectively.

The full fruition of the composable enterprise model comes when both PBCs and democratized composition tools become widely available. Today, there are already multiple precursors to both PBCs and composition tools, supporting partial implementation of composable enterprise. Visionary application vendors, sensing customers' demand for greater self-expression in application experiences, are evolving through API catalogs to PBC renditions of their application services. Today's PBC precursors include API-centric ("headless") SaaS (e.g., Twilio), API products and marketplaces (e.g., RapidAPI), banking services (e.g., Solaris) or API aggregators (e.g., Plaid), prebuilt integrations (e.g., Cloud Elements), business "microservices" (e.g., Finastra APIs) and business APIs (e.g., SAP Business API Hub). The composition platform precursors include low-code application platforms (e.g., Mendix), business process management suites (e.g., Appian) and integration PaaS (e.g., Dell Boomi).

As the COVID-19 pandemic disruption forces organizations to increase their resilience, many turn to the model of composable enterprise to drive agility, efficiency, scalability and democratization into their application environment. To progress in that direction, organizations prioritize business-modularity of vendor applications and begin to manage their API and low-code resources as strategic investments, and that is pushing the notion of PBCs toward the Peak of Inflated Expectations.

User Advice: Application leaders, in collaboration with CIOs, responsible for strategic business change in their organizations should:

• Prioritize mastery in API management, integration, business-IT collaboration and democratized tooling to achieve preparedness for operating a composable enterprise experience.

• Reject any new monolithic solutions proposed by vendors or in-house developers, and plan to renovate or replace the old ones to begin to move to composable application experiences.

• Accelerate product-style delivery of application capabilities packaged as building blocks for application assembly, using agile and DevOps techniques over traditional methods.

• Build a technology portfolio of democratized tool capabilities in support of development, integration/assembly and governance of composed application experiences.

• Give preference to visionary application vendors that anticipate the architecture of composable enterprise and deliver applications ready for customers' subset/superset recompositions.

• Transform the culture of the IT organization from its nearly exclusive focus on software development to the role of partner and source of strategic guidance, support, service and some software development for business-led technology innovation.

Business Impact: Adoption of PBCs enables operation of the composable enterprise, which in turn delivers resilience, efficiency, agility and democratization to business. But even alone, without the other key components of the future of applications (fusion teams and democratized technology), the transition from the constraints of monolithic applications or the fragmentation of technical APIs to the granularity of business-defined composable components advances the ability of organizations to innovate faster, safer and smarter.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: commercetools; Contentful; Elastic Path; finreach solutions; Finastra; Mambu; Plaid; SAP; Twilio
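The bank-account example used in this profile can be sketched as a self-contained component that exclusively owns its data and exposes the entity's whole life cycle through one API plus an event channel. The class, method and event names below are assumptions for illustration, not any vendor's API.

```python
# Minimal sketch of a packaged business capability (PBC): the component owns
# the bank-account data and exposes the entity's full life cycle (open,
# deposit, withdraw, lookup, close) plus an event channel. Names are
# illustrative only.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class BankAccountPBC:
    _accounts: Dict[str, float] = field(default_factory=dict)   # encapsulated data
    _subscribers: List[Callable[[dict], None]] = field(default_factory=list)

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        self._subscribers.append(handler)

    def _emit(self, event: dict) -> None:
        for handler in self._subscribers:
            handler(event)

    def open(self, account_id: str) -> None:
        self._accounts[account_id] = 0.0
        self._emit({"type": "account_opened", "id": account_id})

    def deposit(self, account_id: str, amount: float) -> None:
        self._accounts[account_id] += amount
        self._emit({"type": "deposit", "id": account_id, "amount": amount})

    def withdraw(self, account_id: str, amount: float) -> None:
        if self._accounts[account_id] < amount:
            raise ValueError("insufficient funds")
        self._accounts[account_id] -= amount
        self._emit({"type": "withdrawal", "id": account_id, "amount": amount})

    def lookup(self, account_id: str) -> float:
        return self._accounts[account_id]

    def close(self, account_id: str) -> None:
        del self._accounts[account_id]
        self._emit({"type": "account_closed", "id": account_id})


# A composed application experience reacts to events instead of reaching into
# the PBC's data store directly.
pbc = BankAccountPBC()
pbc.subscribe(lambda e: print("event:", e))
pbc.open("acc-1")
pbc.deposit("acc-1", 100.0)
print("balance:", pbc.lookup("acc-1"))
```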
Citizen Twin

Analysis By: Alfonso Velosa; Marty Resnick

Definition: A digital twin of a citizen is a virtual representation of an individual. Governments use citizen twins to support new or enhanced citizen services or government missions such as pandemic or safety management. The citizen twin has a model, data, a unique one-to-one association, and monitorability. It integrates data into the twin from siloed public and commercial sources such as health records, social media, phone location logs, and physical infrastructure such as cameras and wearables.

Position and Adoption Speed Justification: Governments are increasingly developing digital twin models of citizens to monitor and help address health, safety, travel, membership, and social media impacts on society. The citizen twin can be used to build profiles, personas, and scores helping stakeholders make decisions, such as aligning medical treatment, managing transportation resources, or taking sensor data to try to understand the health of passengers arriving on an airplane. Aggregated versions of the anonymized citizen twin will be used to understand broader societal patterns, drive government resource allocation and utilization, and impact societal behavior.

Precursors already exist. In western countries, financial organizations provide citizens with credit rating scores. Retailers model shoppers. China has a citizen scoring system. A variety of airport and retail vendors are developing passenger and shopper tracking solutions.

User Advice: CIOs need to help their governments or enterprises take advantage of this emerging trend to serve citizens and customers better. At the same time, CIOs must protect their citizens, governments, and enterprises from misuse of citizen data. Key steps include:

• Transparently develop robust privacy and digital ethics policies.

• Establish clear benefits to citizens, such as certifying that children in a classroom are all healthy or simplifying medical triage to get a citizen to medical care.

• Develop sensor and IoT monitoring capability.

• Invest in integration skills to connect into a diverse set of data sources.

• Use AI to build and test the usefulness of a variety of citizen-twin-based scores.

Business Impact: Governments' safety initiatives will increasingly aggregate citizen data across the world, as they seek to serve citizens and to protect them from pandemics or other crises. This will have a range of key impacts, including:

• Increased debates over privacy and the merits of government access to citizen data, although this has been difficult due to politicization in a variety of western countries.

• Expect scope creep as government bureaucracies increase the types and quantity of data collection.

• Government curation of aggregated citizen data is a security risk for government data and possibly a safety risk for the individual citizen.

• There will be increased regulation to balance the government use of the data with the citizens' respective rights to privacy.

• As governments work to collect more data on citizens, this may drive a dialogue to get more services and other financial benefits in return to citizens, but it will expose a lack of integration skills across data sources — and political infighting over data siloes.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Apple; Google; VANTIQ

Digital Twin of the Person

Analysis By: Marty Resnick; Alfonso Velosa

Definition: A digital twin of the person (DToP) not only mirrors a unique individual but is also a near-real-time synchronized multipresence of the individual in both digital and physical spaces. This digital instantiation (or multiple instantiations) of a physical individual continuously intertwines, updates, mediates, influences, and represents the person in multiple scenarios, experiences, circumstances, and personas.

Position and Adoption Speed Justification: A simple DToP is already being used for medical and biotech use cases. For example, analyzing healthcare plans, preventative care, wellness and disease control uses a rudimentary DToP to predict future medical costs. Furthermore, the citizen digital twin (a "social" subset of a DToP) is being used to help address health, safety, travel, membership, and social media impacts on society. The impact of DToPs will continue to grow in areas such as education, remote working, consumer shopping, gaming and social media.

User Advice: The "avatar" has often been considered a digital representation of someone in various situations; however, the avatar is just a visualization or digital rendition of the person and is not typically synchronized to the physical person it is linked to. What really makes a DToP different is the role of the twin as a near-real-time proxy for the state or characterization of the physical twin, and the various levels of data fidelity that make this representation effective to achieve a particular outcome. Outcomes range from monitoring a potentially hazardous declining health condition, to watching for aberrant social behavior, to safety in hazardous working conditions.

High-fidelity situations (high level of data, high visualization) would include the ability to be represented in the following situations:

• Social experience

• Business meeting

• Consumer shopping

• Gaming

Lower fidelity (high level of data, low visualization):

• Medical

• Safety

• Healthcare

• Consumer 360

• Human resources

For some enterprises, the critical link will be the connection between an asset and a person. The digital twin of the asset (e.g., a smart meter) will be connected with the digital twin of the person (e.g., a residential consumer) and may drive opportunities for serving the customer while driving cost and process optimization and new revenue.

Enterprises should begin to adopt the concept of DToP to facilitate more collaborative and engaging remote working situations, understand and predict customer demands, and accelerate new business models reliant on digital representations of people. Enterprises must develop strong digital ethics, security, and data governance policies to protect customer, employee and citizen privacy and data, while meeting legal and other compliance requirements.

Business Impact: Digital twin of the person opens up new and emerging business models but also opens the door for additional security, privacy and ethical considerations. Currently, precursors or early versions of digital twin of the person are used for medical, e-commerce and social monitoring. But as the concept expands, new citizen services, medical care and sales options will bring in a flood of experimentation by governments and commercial entities. Effective data-driven decision making and testing out of various scenarios will be possible with less risk and in a much more efficient way. New ways to serve citizens, patients, or shoppers will be enabled by real-time understanding of their situation. In parallel, enterprises with poor security and digital ethics policies expose themselves to significant legal and regulatory risk.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Amazon; Apple; Insight Enterprises; NTT; Philips Healthcare; ScaleOut Software; Sim&Cure; Tencent
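At a minimum, a digital twin of the person is a state record kept in near-real-time sync with events about the individual and queried at different levels of fidelity. The sketch below is a deliberately simplified illustration of that idea; the field names, fidelity labels and sample events are assumptions, not a reference model.

```python
# Simplified sketch of a digital twin of the person: a state record updated
# from incoming events and queried at a chosen data fidelity. Field names
# and events are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict, Optional


@dataclass
class PersonTwin:
    person_id: str
    state: Dict[str, Any] = field(default_factory=dict)
    last_update: Optional[datetime] = None

    def apply_event(self, source: str, attributes: Dict[str, Any]) -> None:
        # Each event (wearable, badge, HR system, ...) refreshes part of the state.
        self.state.update({f"{source}.{k}": v for k, v in attributes.items()})
        self.last_update = datetime.now(timezone.utc)

    def view(self, fidelity: str) -> Dict[str, Any]:
        # "high" fidelity returns everything; "low" returns only coarse signals.
        if fidelity == "high":
            return dict(self.state)
        return {k: v for k, v in self.state.items() if k.endswith(".status")}


twin = PersonTwin("employee-42")
twin.apply_event("wearable", {"heart_rate": 58, "status": "normal"})
twin.apply_event("badge", {"zone": "lab-3", "status": "on-site"})
print(twin.view("low"))    # coarse, more privacy-preserving view
print(twin.view("high"))   # full near-real-time state
```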

Multiexperience

Analysis By: Jason Wong

Definition: Multiexperience describes the interactions across a variety of digital touchpoints (e.g., web, mobile apps, chatbots, AR/VR, wearables), using a combination of interaction modalities (e.g., no-touch, voice, vision, gesture) in support of seamless and consistent digital user journeys. Multiexperience is part of a long-term shift from computers as individual devices we use to a multidevice, multisensory and multilocation environment we experience.

Position and Adoption Speed Justification: Through 2030, the user experience (UX) will undergo a significant shift in terms of how users experience the digital world. Web and mobile apps are already commonplace, but they are undergoing UX changes driven by new capabilities like progressive web apps, WebXR and AI services. Conversational platforms allow people to interact more naturally and effortlessly with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are changing the way people perceive the digital world. This combined shift in both perception and interaction models leads to the future multisensory, multidevice and multitouchpoint experience. Having the ability to communicate with users across many human senses will provide a richer environment for delivering nuanced information.

The long-term manifestation of multiexperience (MX) is a unified digital experience that is seamless, collaborative, consistent, personalized and ambient. This will happen over the next five years — and is already accelerated by the COVID-19 pandemic, which has increased reliance on digital touchpoints. Privacy concerns, in particular, may dampen the enthusiasm and impact of adoption. On the technical front, the long life cycles of many consumer devices and the complexity of having many creators developing elements independently will be enormous barriers to seamless integration. Don't expect automatic plug and play of off-the-shelf devices, applications and services. Instead, proprietary ecosystems of devices will exist in the near term. Focus on understanding how unified digital experiences impact the business and use evolving multiexperience technologies to create targeted solutions for customers or internal constituencies.

User Advice: Application leaders should:

• Identify three to five high-value proof-of-concept projects in which multiexperience design can lead to more compelling and transformative experiences.

• Use personas and journey mapping to address the requirements of diverse enterprise use cases, including external-facing and internal-facing scenarios, to support a unified digital experience.

• Collaborate with marketing/branding to educate the UX team on the brand strategy and identity; ensure UX teams accurately apply visual, behavioral and written guidelines across all relevant multiexperience touchpoints and modalities.

• Establish a multidisciplinary core team potentially including but not limited to IT, business leadership, HR, facilities management, UX, experience design and product.

Business Impact: Organizations are shifting their delivery models from projects to products, but beyond products is the experience — the collection of feelings, emotions and memories. Understanding and exploiting multiexperience is essential to the effectiveness of customer experience (CX), employee experience (EX) and UX strategies. Multiexperience starts with a mindset to remove friction and effort for the users — internal or external — through the contextual use of digital technologies. Adopting this mentality will allow application leaders to better align with business objectives and be more agile at delivering positive business outcomes. When CX, EX, UX and MX strategies are executed with one another in harmony and synchronicity, you can deliver transformative and memorable experiences for customers, employees and all users of your digital products and services.

Benefit Rating: Transformational

Market Penetration: 1% to 5% of target audience

Maturity: Emerging
Responsible AI

Analysis By: Svetlana Sicular

Definition: Responsible AI is an umbrella term for many aspects of making the right business and ethical choices when adopting AI that organizations often address independently. These include business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy and regulatory compliance. Responsible AI operationalizes organizational responsibility and practices that ensure positive and accountable AI development and exploitation.

Position and Adoption Speed Justification: Responsible AI signifies the move from declarations and principles to operationalization of AI accountability at the individual, organizational and societal levels. While AI governance is practiced by designated groups, responsible AI applies to everyone who is involved in the AI process. Organizations are increasing their AI maturity, which requires defined methods and roles that operationalize AI principles. Lately, responsible AI has been elevated to the highest organization levels by Accenture, Google, Microsoft, OpenAI, PwC, the Government of Canada, the Government of India, the World Economic Forum (WEF) and more. Although responsible AI is nascent in industries, pioneers include AXA, Bank of America, State Farm, Telefónica and Telus.

The COVID-19 pandemic stressed the need for responsible AI, when all governments and the entire world were following AI models of pandemic projections and economies' reopening. Many AI vendors and individual data scientists immediately shifted to solving pandemic problems, where they had to balance vital deliverables and risks associated with privacy, ethics, abrupt data changes and unconfirmed facts. Using AI for virus tracking, monitoring mask distribution and social distancing are subjects of public debate regarding appropriate AI interpretation, transparent data handling and clear exit plans for such temporary measures.

User Advice: Data and analytics leaders, take responsibility — it's not AI, it's you who are liable for the results and impacts, either intended or unintended. Extend existing mechanisms, like data and analytics governance and risk management, to AI to:

• Establish and refine processes for handling AI-related business decisions.

• Designate, for each use case, a champion accountable for the responsible development of AI.

• Establish processes for AI review and validation. Have everyone in the process defend their decisions in front of their peers and validators.

• Provide guidelines to assess how much risk is appropriate.

• Ensure that humans are in the loop to mitigate AI deficiencies (see the sketch after this profile).

Build bridges to those organizational functions that are vital to AI success, but poorly educated about AI value and dangers, to:

• Open a conversation with security, legal and customer experience functions.

• Build an AI oversight committee of independent, respected people.

• Continuously raise awareness of AI differences from familiar concepts. Provide training and education on responsible AI, first to the most critical personnel, and then to your entire AI audience.

• Have an escalation procedure early on in case something goes wrong.

• Anticipate human problems with AI: Identify enthusiasts who can help establish ongoing education about responsible AI.

The biggest problem in AI adoption currently is mistrust in AI solutions and low confidence in AI's positive impact. Responsible AI helps organizations go beyond purely technical AI progress to more successfully balance risk and value. With AI maturity, you will learn a lot and will make fewer mistakes — remain humble and keep learning.

Business Impact: Societal impacts of AI are frequently depicted in a distorted way, either too optimistically or as doom and gloom, while the responsible AI approach helps get a realistic view and instills trust. AI, like no other technology, encompasses organizational and societal dangers that have to be mitigated by responsible AI development and handling:

• The way AI is developed will encompass mandatory awareness and actions regarding all aspects of responsible AI. Gartner predicts, "By 2023, all personnel hired for AI development and training work will have to demonstrate expertise in responsible development of AI."

• New roles, from an independent AI validator to a chief responsible AI officer, are necessary and are already being created to operationalize responsible AI at the organizational and societal levels.

• Responsible AI paves the way for new business models for creation of products, services or channels. It forms new ways of doing business that will result in significant shifts in market or industry dynamics via confirmed responsible AI actions and protocols; for example, a cross-organizational effort to fight "deep fakes."

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Emerging
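One of the recommendations above, keeping humans in the loop with an escalation path, can be expressed as a simple routing policy: low-confidence or high-risk model outputs go to human review instead of being auto-applied, and every decision leaves an audit record for validators. The thresholds, risk tiers and use-case names below are illustrative assumptions.

```python
# Illustrative human-in-the-loop gate: auto-apply only confident, low-risk AI
# decisions; escalate everything else for human review; keep an audit record.
# Thresholds and tiers are assumptions for the sketch.
from dataclasses import dataclass
from typing import List


@dataclass
class Decision:
    use_case: str
    prediction: str
    confidence: float
    risk_tier: str        # "low", "medium" or "high"


# High-risk use cases never auto-apply (threshold above any possible confidence).
AUTO_APPLY_THRESHOLD = {"low": 0.80, "medium": 0.95, "high": 1.01}
audit_log: List[dict] = []


def route(decision: Decision) -> str:
    threshold = AUTO_APPLY_THRESHOLD[decision.risk_tier]
    action = "auto_apply" if decision.confidence >= threshold else "human_review"
    audit_log.append({
        "use_case": decision.use_case,
        "prediction": decision.prediction,
        "confidence": decision.confidence,
        "risk_tier": decision.risk_tier,
        "action": action,
    })
    return action


print(route(Decision("credit_limit_increase", "approve", 0.97, "medium")))  # auto_apply
print(route(Decision("loan_denial", "deny", 0.99, "high")))                 # human_review
print(len(audit_log), "decisions recorded for review boards")
```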
AI-Augmented Development

Analysis By: Arun Batchu

Definition: AI-augmented development (AIAD) is the use of AI technologies such as machine learning (ML), natural language processing (NLP) and similar technologies to aid application development teams in creating and delivering applications faster, more consistently, and with higher quality.

Position and Adoption Speed Justification: Application development is part science, part engineering and part craft. The expanding diversity and complexity of building software for the digital business puts a premium on business outcomes and continuous value delivery. However, reliance on human expertise creates an upper limit on how fast we can design, create and test new software. Handcrafted software, just like any other handicraft, is inconsistent, which can be a problem in creating mission-critical systems that cannot fail. Today's application development methods involve slow, repetitive and mundane tasks that sap developers' creativity and drain their productivity. Additionally, it takes a long time for a novice programmer to become a master engineer, further exacerbating the shortage of critical application development skills.

AIAD attempts to help resolve these issues by augmenting development teams' capabilities, acting as a virtual co-developer, an expert coach and a quality control inspector.

In 2019, two key AI technologies tag-teamed to dramatically improve the quality of such AI-augmented software development: deep learning (a special type of ML) and NLP. By treating millions of lines of high-quality open-source software code as data, and leveraging the ubiquitous availability of high-performance computing power, AI researchers and startups have demonstrated remarkable AI developer "co-pilots" that are instantly able to predict entire lines of code, detect quality problems (such as insecure code) and even fix them.

While AI is revolutionizing "high-control" software application development, a similar metamorphosis is happening in "low-code" application development. Several vendors featured in our Low-Code Application Platform MQ and Multiexperience Development Platform MQ are aggressively investing in AI-augmented capabilities. These capabilities include machine-learning-driven recommendations that generate next best actions (such as workflows), AI coaches that teach novices and application development virtual assistants.

Yet these technologies, while highly promising, are in their infancy. We don't know enough about their reliability, stability, scalability and generality. We don't know if and how customizable the models they generate can be. We don't understand their failure modes completely. The impressive models these technologies generate are opaque. Will we trust them without transparency? How do we know they have not been tampered with, without provenance to prove their authenticity? How do we know that the code they generate is not copyrighted or malicious? Indeed, AI researchers are actively working on improving the technologies to resolve these and other issues.

Despite these challenges, early adopters could gain significant competitive advantage by embracing these innovations today.

User Advice: Application leaders responsible for application development teams should:

• Encourage their teams to experiment with these tools today and adopt them when there is a good fit.

• Monitor how AI is transforming software development roles and prepare a learning and development plan for your team accordingly.

• If not already familiar, encourage your teams to learn how machine learning and other AI technologies work, the challenges that come with them and how to mitigate them.

• Engage with augmentation tool vendors to improve and co-develop useful features and capabilities.

Business Impact: Unlike previous AI technologies that were brittle and static, today's AI technologies are general-purpose technologies (GPT) and adaptive. Adaptive GPTs are transformative, just like steam and electric technologies were in their era. Unlike steam and electric technologies, today's AI technologies increase in their capabilities in proportion to the amount of data and computing capacity available to them. Propelled by the rapid growth of software code, the data generated by digital applications and cloud computing, these AI machines will gain capabilities that will transform the software development life cycle in the next three to five years. We expect the technology to pass through three stages. The first and current stage is where AI is able to help as an apprentice, suggesting code fragments. The next stage is where the AI becomes smart enough to act like a peer to the developer. The third stage is the lead expert stage, where the AI writes most of the code with the developer tweaking as necessary.

This wave could reach tidal proportions or dissipate, like any emerging technology wave. You must plan for it now, for failing to plan might mean planning to fail.

Benefit Rating: Transformational

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Codota; DeepCode; Google; Kite; Mendix; Microsoft; OutSystems; Parasoft
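The "apprentice" stage described above, where AI suggests the next fragment of code, can be illustrated with a toy bigram model trained on a handful of code lines. Real co-pilot tools use large deep-learning and NLP models trained on millions of lines of open-source code; this sketch only shows the shape of the idea, and the tiny corpus is invented.

```python
# Toy "apprentice" code suggester: a bigram model over tokenized code lines
# proposes likely next tokens. Real AI co-pilots use large deep learning/NLP
# models; the corpus here is invented for illustration.
from collections import Counter, defaultdict

corpus = [
    "for item in items :",
    "for key in mapping :",
    "if value is None :",
    "if item is None :",
    "return sorted ( items )",
]

next_counts = defaultdict(Counter)
for line in corpus:
    tokens = line.split()
    for current, nxt in zip(tokens, tokens[1:]):
        next_counts[current][nxt] += 1


def suggest(prefix: str, k: int = 2):
    """Suggest up to k continuations for the last token of the prefix."""
    last = prefix.split()[-1]
    return [tok for tok, _ in next_counts[last].most_common(k)]


print(suggest("for item"))   # e.g., ['in', 'is']
print(suggest("if value"))   # e.g., ['is']
```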
Composable Enterprise

Analysis By: Yefim Natis; Dennis Gaughan; Gene Alvarez

Definition: A composable enterprise designs its business models, technology architecture, organization and partnership ecosystems in a modular manner, so that it can safely and rapidly change (recompose) at any moment of need. Composable enterprise imposes a model of application design that imagines applications as experiences assembled by or for its users from vendor-provided and custom packaged business capabilities as the building blocks.

Position and Adoption Speed Justification: The core principles of the composable enterprise — modularity, efficiency, continuous improvement and adaptive innovation — are familiar to most organizations. Most organizations have been investing in improving their operation on each of these parameters with some successes, but lacking a cohesive experience of a broad change. The model of composable enterprise brings these core characteristics together and applies them in equal manner to the managing of business models, organizational structures, ecosystem strategies, the ways of work of the employees, and technology investments. The challenge to achieving consistent benefits of composable enterprise across the organization is not any one particular investment, but the essential underlying requirement for the pervasive practice of "composable enterprise thinking." This fundamentally cultural change — from the rigidity of the familiar enterprise structures to the elasticity of active continuous change — is the most significant barrier to achieving the benefits of composable enterprise.

The sudden disruption of the COVID-19 pandemic has woken up the leadership of every business to the existentially critical importance of business resilience. In this context, business leaders and technology vendors all are prepared to make strategic and radical changes to their operations, practices, policies and cultural postures to become better prepared for the new and next business disruptions. This strategic imperative builds a momentum for steady but fast adoption of the core principles of composable enterprise, pushing it toward the Peak of Inflated Expectations and on to the Plateau of Productivity.

User Advice: Application leaders, guiding their organizations in the process of digital transformation, should:

• Use composable enterprise thinking to innovate faster and safer, to reduce costs, and to lay the foundation for business-IT partnerships.

• Prioritize formation of business-IT fusion teams to facilitate faster, smarter and safer decisions in navigating the business through current and future disruptions.

• Assemble a democratized technology platform to best support the operation of fusion teams by combining low-code composition/development tools with traditional code-centric integration/development technology.

Business Impact: Organizations that adopt the model of composable enterprise in their business, technology and culture achieve a new level of resilience and a transformative access to innovation. They move from the rigid and inefficient traditional normal of hierarchical thinking to the active agility of composable experience. Such an organization assembles (integrates) its application experiences from internal and external ecosystems of components (packaged business capabilities) — to empower the organization to actively track and support the specific (and changing) requirements of its users.

Benefit Rating: Transformational

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

At the Peak

Data Fabric

Analysis By: Ehtisham Zaidi; Robert Thanaraj; Mark Beyer

Definition: A data fabric is an emerging data management design concept for attaining flexible, reusable and augmented data integration pipelines, services and semantics, in support of various operational and analytics use cases delivered across multiple deployment and orchestration platforms. Data fabrics support a combination of different data integration styles and utilize active metadata, knowledge graphs, semantics and ML to augment data integration design and delivery.

Position and Adoption Speed Justification: The data fabric — as a data management design concept — is a direct response to long-standing issues now being aggravated by digital transformation. These include the multiplicity of data sources and types, the soaring data volume, the increasing complexity of data integration and the rising demand for real-time insights. Simply put, a data fabric is a design that leverages existing tools and platforms and adds metadata sharing, metadata analysis and metadata-enabled self-healing, along with orchestration and administration tools to manage the environment. As a data fabric becomes increasingly dynamic, it evolves to support automated data integration delivery. Data fabrics are almost at the Peak of Inflated Expectations due to the hype in the market and the inherent confusion about how to deliver them.

A data fabric is not in itself a tool/platform that can be purchased — it is a design concept that requires a combination of tools, processes and skill sets to deliver. Yet, we witness various tools being developed and sold under the data fabric tag which do not provision all the requirements needed to fulfill a data fabric, not least the ability to integrate existing data integration technologies together to deliver a dynamic integration design that uses active metadata to auto-adjust to new use-case requirements.

Data fabrics will, at the very least, need to collect all forms of metadata (not just technical metadata) and then perform machine learning over this metadata to provide recommendations for integration design and delivery. This capability is typically achieved through the augmented data catalog capabilities of a data fabric. Advanced data fabrics have the capability to assist with graph data modeling (which is useful to preserve the context of the data along with its complex relationships) and allow the business to enrich the models with agreed-upon semantics. Some data fabrics come embedded with capabilities to create knowledge graphs of linked data and use ML algorithms to provide actionable recommendations and insights to developers and consumers of data. Finally, data fabrics provide capabilities to deliver integrated data through flexible data delivery styles such as data virtualization and/or a combination of APIs and microservices (and not just ETL). These are capabilities that together make up a data fabric and will mature over time as more vendors move away from point-to-point and static data integration designs and adopt more dynamic data fabrics.

User Advice: Data and analytics leaders looking to modernize their data management solutions must:

• Invest in augmented data catalogs. These will help you to inventory all types of metadata — along with their associated relationships — in a flexible data model. Enrich the model through semantics and ontologies that make it easier for the business to understand the model and contribute to it.

• Combine different data integration styles to incorporate a portfolio-based approach into the data integration strategy (for example, not just ETL, but a combination of ETL with data virtualization).

• Establish a technology base for the data fabric and identify the core capabilities required before making further purchases. Start by evaluating your current tools (such as data catalogs, data integration, data virtualization, semantic technology and DBMSs) to identify the existing or missing capabilities.

• Invest in data management vendors which exhibit a strong roadmap on augmented capabilities, i.e., embedded ML algorithms that can utilize metadata and provide actionable recommendations to inform and automate parts of data integration design and delivery.

Business Impact: By leveraging the data fabric design, data and analytics leaders can establish a more scalable data integration infrastructure that can provide immediate business impact and enable new use cases, such as:

• Data fabrics provide a much-needed productivity boost to data engineering teams that are struggling with tactical, mundane and often redundant tasks of creating data pipelines. Data fabrics, once enabled, will assist data engineering teams by providing insights on data integration design and will even automate repeatable transforms and tasks so that data engineers can focus on more strategic initiatives.

• Data fabrics also support enhanced metadata analysis to support data contextualization by adding semantic standards for context and meaning (through knowledge graph implementations). This enables business users to be more involved in the data modeling process and allows them to enrich models with agreed-upon semantics.

• Over time, the graph develops as more data assets are added, and it can be accessed by developers and delivered to various applications as needed. This allows organizations to integrate data once and share it multiple times, thereby improving the productivity of data engineering teams.

• Data fabrics provide improved decisions for when to move data or access it in place. They also provide the much sought-after capability to convert self-service data preparation views into operationalized views that need physical data movement and consolidation for repeatable and optimized access (in a data store such as a data warehouse, for example).

Benefit Rating: Transformational

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Cambridge Semantics; Cinchy; CluedIn; data.world; Denodo; Informatica; Semantic Web Company (PoolParty); Stardog; Talend
Embedded AI

Analysis By: Amy Teng; Alan Priestley

Definition: Embedded AI refers to the use of AI/ML techniques within embedded systems to enable analysis of locally captured data. This requirement is particularly critical for electronic equipment where decision latency must be minimized for operational efficiency and safety. It can also enable always-on use cases targeting battery-operated devices requiring low-power operation.

Position and Adoption Speed Justification: There is an increasing demand for embedded systems to analyze and interpret the data they capture by leveraging AI/ML locally.

Virtually all major MCU vendors have expanded their toolchains to include compilers, model conversion tools, libraries and application samples (such as object and gesture recognition) to enable embedded AI. Additionally, the emergence of tiny machine learning (tinyML) has encouraged many new lightweight ML algorithms. In February 2020, Apple acquired an AI startup, Xnor.ai, focusing on BNN (binarized neural networks), which is a type of tinyML.

Vendors are also enhancing the AI capabilities of their embedded processors by integrating hardware logic blocks into chips to optimize and advance inference performance. Renesas Electronics has introduced an MPU with an embedded Dynamically Reconfigurable Processor (DRP), a programmable on-chip logic block that can be reconfigured via firmware updates. This enables the processor to be easily updated with the latest AI algorithms. NXP has a general-purpose MCU with heterogeneous cores (Arm Cortex-M33 and Cadence Tensilica HiFi 4 DSP) targeting audio/video analytics applications.

Arm's Cortex-M55 is the first Armv8.1-M based MCU core with Helium vector extensions focusing on DSP/ML compute capabilities, and Ethos-U55 is the first micro-NPU that will co-work with Cortex-M by providing configurable MACs and weight compression. These two technologies, together with Arm's software development frameworks, enable partners and developers to quickly expand into embedded AI/ML applications by reusing current assets and experience. Semiconductor vendors are integrating these hardware IP blocks into their product lineups, and more products are expected to be available for market adoption from 2021.

In addition to the aforementioned vendor activities, we expect the market to remain vibrant throughout the year; as a result, we have updated its position toward the peak of the Hype Cycle.

User Advice: Adoption of embedded AI requires a clear workflow and vendor support on tools, especially where the embedded system is used for real-time response and control. As the market is at an early stage of adoption, IT leaders must:

• Determine where (endpoint, edge or cloud) is best to execute AI-based data analytics.

• Identify the subset of applications in your OT system or product portfolios that can be meaningfully impacted using embedded AI.

• Evaluate the availability of reference designs that are close to your target application, chip vendors, their solutions and design partners. Focus on their ability to translate and optimize your trained model into local systems.

• Evaluate the process of updating algorithms — ensure no security vulnerability is created due to changing designs.

Business Impact: Embedded AI enables devices to analyze captured data using AI/ML techniques locally, reducing the need to transfer data to a remote data center for analysis. This can reduce latency and enhance operational efficiency. Companies who own, sell or serve IoT and industrial electronics, ranging from OT machines, factory equipment and IoT sensors to consumer electronics, will be positively impacted depending on the inclusion of and the value created by AI.

Initial justification will come from business cases focusing on first-order operational savings, e.g., predictive maintenance — these are the easiest and clearest to define. As adoption picks up, Gartner expects to see additional value created through dynamic and real-time optimization of manufacturing lines to incoming orders and workloads, and intelligent buildings that optimize employee productivity.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Arm; Cartesiam; NXP Semiconductors; One Tech; Renesas Electronics; STMicroelectronics
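To make a model small and cheap enough for an MCU, a common first step in the toolchains described above is quantizing weights from 32-bit floats to 8-bit integers. The sketch below shows that arithmetic on a made-up weight matrix; real embedded and tinyML toolchains automate this and much more (activation quantization, layer fusion, code generation).

```python
# Toy post-training quantization: map float32 weights to int8 plus a scale,
# as embedded AI toolchains do to shrink models for MCUs. The weight matrix
# is made up; real flows also quantize activations, fuse layers, etc.
import numpy as np

rng = np.random.default_rng(7)
w_fp32 = rng.normal(0.0, 0.2, size=(16, 32)).astype(np.float32)

scale = np.abs(w_fp32).max() / 127.0                 # symmetric, per-tensor scale
w_int8 = np.clip(np.round(w_fp32 / scale), -127, 127).astype(np.int8)

# On-device inference would use w_int8 with integer math; here we just check
# how much information the 4x smaller representation loses.
w_restored = w_int8.astype(np.float32) * scale
print("bytes fp32:", w_fp32.nbytes, "bytes int8:", w_int8.nbytes)
print("max abs error:", float(np.abs(w_fp32 - w_restored).max()))
```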
Secure Access Service Edge (SASE)

Analysis By: Joe Skorupa; Neil MacDonald

Definition: Secure access service edge (SASE, pronounced "sassy") delivers multiple capabilities such as SD-WAN, SWG, CASB, NGFW and zero trust network access (ZTNA). SASE supports branch office and remote worker access. SASE is delivered as a service, and based upon the identity of the device/entity, combined with real-time context and security/compliance policies. Identities can be associated with people, devices, IoT or edge computing locations.

Position and Adoption Speed Justification: SASE is driven by enterprise digital business transformation: the adoption of cloud-based services by distributed and mobile workforces; edge computing; and business continuity plans that must include flexible, anywhere, anytime, secure remote access. While the term originated in 2019, the architecture has been deployed by early adopters as early as 2017. By 2024, at least 40% of enterprises will have explicit strategies to adopt SASE, up from less than 1% at year-end 2018.

By 2023, 20% of enterprises will have adopted SWG, CASB, ZTNA and branch FWaaS capabilities from the same vendor, up from less than 5% in 2019. However, today most implementations involve two vendors (SD-WAN plus network security), although single-vendor solutions are appearing. Dual-vendor deployments that have deep cross-vendor integration are highly functional and largely eliminate the need to deploy anything more than an L4 stateful firewall in the branch office. This will drive a new wave of consolidation as vendors struggle to invest to compete in this highly disruptive, rapidly evolving landscape.

The inversion of networking and network security patterns as users, devices and services leave the traditional enterprise perimeter will transform the competitive landscape for network and network security as a service over the next decade, although the winners and losers will be apparent by 2022. True SASE services are cloud-native — dynamically scalable, globally accessible, typically microservices-based and multitenant. The breadth of services required to fulfill the broad use cases means very few vendors will offer a complete solution in 2020, although many already deliver a broad set of capabilities. Multiple incumbent networking and network security vendors are developing new or enhancing existing cloud-delivery-based capabilities.

SASE is in the early stages of market development but is being actively marketed and developed by the vendor community. Although the term is relatively new, the architectural approach (cloud if you can, on-premises if you must) has been deployed for at least two years.

User Advice: There have been more than a dozen SASE announcements over the past 12 months by vendors seeking to stake out their position in this extremely competitive market. There will be a great deal of slideware and marketecture, especially from incumbents that are ill-prepared for the cloud-based delivery-as-a-service model and the investments required for distributed PoPs. This is a case where software architecture and implementation matter.

When evaluating SASE offerings, be sure to:

• Involve your CISO and lead network architect when evaluating offerings and roadmaps from incumbent and emerging vendors, as SASE cuts across traditional technology boundaries.

• Leverage a WAN refresh, firewall refresh, VPN refresh or SD-WAN deployment to drive the redesign of your network and network security architectures.

• Strive for not more than two vendors to deliver all core services.

• Use cost-cutting initiatives in 2020 from MPLS offload to fund branch office and workforce transformation via adoption of SASE.

• Understand what capabilities you require in terms of networking and security, including latency, throughput, geographic coverage and endpoint types.

• Combine branch office and secure remote access in a single implementation, even if the transition will occur over an extended period.

• Avoid vendors that propose to deliver the broad set of services by linking a large number of products via virtual machine service chaining.

• Prioritize use cases where SASE drives measurable business value. Mobile workforce, contractor access and edge computing applications that are latency sensitive are three likely opportunities.

Some buyers will implement a well-integrated dual-vendor best-of-breed strategy while others will select a single-vendor approach. Expect resistance from team members that are wedded to appliance-based deployments.

Business Impact: SASE will enable I&O and security teams to deliver the rich set of secure networking and security services in a consistent and integrated manner to support the needs of digital business transformation, edge computing and workforce mobility. This will enable new digital business use cases (such as digital ecosystem and mobile workforce enablement) with increased ease of use, while at the same time reducing costs and complexity via vendor consolidation and dedicated circuit offload.

COVID-19 has highlighted the need for business continuity plans that include flexible, anywhere, anytime, secure remote access, at scale, even from untrusted devices. SASE's cloud-delivered set of services, including zero trust network access, is driving rapid adoption of SASE.

Benefit Rating: Transformational

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Akamai; Cato Networks; Cisco; Citrix; iboss; Netskope; Open Systems; Palo Alto Networks; VMware; Zscaler
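At the heart of the SASE definition is a per-session access decision made from identity, real-time context and compliance posture. The sketch below expresses that decision as a small function; the attributes, rules and result labels are illustrative assumptions, not any vendor's policy model.

```python
# Toy SASE-style access decision: combine identity, device posture and
# real-time context into an allow/deny (possibly with conditions).
# Attributes and rules are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Session:
    identity: str          # person, device, IoT or edge location
    role: str
    device_managed: bool
    device_patched: bool
    geo: str
    app_sensitivity: str   # "low" or "high"


def decide(s: Session) -> str:
    if not s.device_managed and s.app_sensitivity == "high":
        return "deny"                      # untrusted device, sensitive app
    if not s.device_patched:
        return "allow_with_isolation"      # e.g., browser isolation / read-only
    if s.role == "contractor" and s.app_sensitivity == "high":
        return "allow_with_mfa"            # step-up authentication
    return "allow"


print(decide(Session("alice@corp", "employee", True, True, "SE", "high")))      # allow
print(decide(Session("bob@partner", "contractor", True, True, "US", "high")))   # allow_with_mfa
print(decide(Session("iot-sensor-17", "iot", False, False, "CN", "high")))      # deny
```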

Social Distancing Technologies

Analysis By: Leif-Olof Wallin; Nick Jones

Definition: Social distancing technologies help to encourage individuals to maintain a safe distance from each other. Some of these technologies and solutions also provide contact tracing capabilities if an individual is discovered to be infected. They can be implemented in many ways, including an app on a smartphone, as a feature of a location tracking system, a dedicated wearable device or using observational tools such as video analytics.

Position and Adoption Speed Justification: Social distancing technologies have emerged as tactical solutions to help organizations and individuals deal with the COVID-19 pandemic. Many of these technologies use wireless systems for proximity detection, but in principle, any technology that can measure location or proximity can be used to support social distancing. All such systems are imperfect, and face challenges such as accuracy, reliability, user acceptance, privacy concerns and, in the case of smartphone solutions, the challenges of supporting an app on a very wide range of consumer devices. However, despite these challenges, we expect them to be a useful tactic to reduce risk in the pandemic. As most such systems are based on modifications of existing technologies, we expect rapid maturity — within two years.

User Advice: Organizations that need to manage risk as staff return to work after the pandemic should consider social distancing technologies because, despite their limitations, any form of risk reduction is better than none. Industrial, construction and blue-collar workers who may not carry smartphones in their normal working environment may benefit from dedicated proximity-warning devices, or equipment such as smart hard hats that have been modified to track proximity. Staff in office-based environments may benefit from app-based solutions. Organizations with comprehensive endpoint management in place will be best equipped to rapidly deploy these tools onto users' devices with minimal friction, as they typically have UEM technologies and a well-defined hardware base. Most organizations will use social distancing technologies in conjunction with processes such as reducing the number of employees in offices and establishing behavior and visual guidelines. Some app-based solutions may be superseded or augmented by national social distancing app initiatives, or apps from megavendors such as Google and Apple.

Social distancing technologies cannot provide a guarantee against infection, so organizations should set realistic expectations for the effectiveness of such tools. All are likely to generate false negatives and positives. It's likely that app-based systems will be less accurate than dedicated wearables. Those deploying the technology should also be transparent about what personal data is stored, collected and retained by such systems and how it will be used for tasks like contact tracing. However, despite the technologies' limitations, we expect many organizations will feel that some support for social distancing is better than none, and additionally some may find their lawyers recommend them to reduce potential liability.

Business Impact: It's easier to apply social distancing technologies in situations where the organization, sometimes in cooperation with a union, can influence individuals and the equipment they use, e.g., by providing smart badges or standard smartphones. Situations include factories, warehouses and some offices. Effective application of social distancing technologies is much more difficult when dealing with a wide range of individuals in the general population, e.g., customers at retail outlets or in showrooms, or visitors to venues such as museums. Challenges in the latter area include privacy, convincing individuals to adopt a solution, and supporting apps on a wide and uncontrolled range of smartphones. Social distancing technology will be one part of a multidimensional strategy that will include tactics such as behavioral guidelines, new working practices and controlling the number of visitors to venues. Some of these solutions can be used for additional use cases, like hand-washing compliance. In some situations, investment in social distancing technology can also be part of a mitigation strategy against future litigation for not taking proper care of employees and customers.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: AiRISTA Flow; Apple; Estimote; Fujitsu America; Google; Kiana; Radiant RFID; Samsung Electronics; Sonitor Technologies; Zebra
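Most wireless proximity detection reduces to estimating distance from received signal strength and alerting below a threshold. The sketch below applies the standard log-distance path-loss formula with assumed calibration constants; real systems also need per-device calibration, smoothing over many samples and generous error margins, which is one reason the profile notes these systems are imperfect.

```python
# Toy proximity check from BLE signal strength using the log-distance
# path-loss model: rssi = tx_power - 10 * n * log10(d). Calibration constants
# and readings are assumptions; real deployments calibrate per device.
TX_POWER_DBM = -59.0    # assumed RSSI at 1 m
PATH_LOSS_EXP = 2.0     # assumed roughly free-space environment
ALERT_DISTANCE_M = 2.0


def estimate_distance_m(rssi_dbm: float) -> float:
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXP))


def too_close(rssi_dbm: float) -> bool:
    return estimate_distance_m(rssi_dbm) < ALERT_DISTANCE_M


for rssi in (-55.0, -63.0, -72.0):
    d = estimate_distance_m(rssi)
    print(f"RSSI {rssi:.0f} dBm -> ~{d:.1f} m -> alert: {too_close(rssi)}")
```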
needs to be explained. There tools. All are likely to generate Some of these solutions can be is still considerable amount false negatives and positives. It’s used for additional use cases, of discussion in sectors such likely that app-based systems will like hand-washing compliance. as insurance or banking, in be less accurate than dedicated In some situations, investment which there are sometimes wearables. Those deploying in social distancing technology company level or even legislative the technology should also be can also be part of a mitigation restrictions that make it a transparent about what personal strategy against future litigation must for the models that these data is stored, collected and for not taking proper care of companies use to be explainable. retained by such systems and employees and customers. In 2020 more vendors introduced how it will be used for tasks like improved explainable AI contact tracing. However, despite capabilities that can help data

52 scientists create an audit trail • Create data and algorithm Benefit Rating: High that starts from data collection policy review boards to track to model development and and perform periodic reviews Market Penetration: 1% to 5% of deployment. In 2020, explainable of machine learning algorithms target audience AI was less hyped compared and data being used. Continue to 2019, and Gartner saw real to explain AI outputs within Maturity: Emerging and useful implementations of changing security requirements, explainable AI. Therefore, we privacy needs, ethical values, Sample Vendors: H2O.ai; IBM; decided to move explainable AI societal expectations and Microsoft; simMachines from prepeak 25% to postpeak cultural norms. 5% on the Hype Cycle. Sliding Into the Trough Business Impact: End-user Carbon-Based Transistors User Advice: organizations may be able to Analysis By: Gaurav Gupta utilize some future interpretability • Foster ongoing conversations capabilities from vendors to be Definition: Carbon-based with various line-of-business able to explain their AI outputs. transistors replace silicon in leaders, including legal But eventually, AI explainability traditional transistors and and compliance, to gain is the end-user organization’s offer an alternative solution for an understanding of the responsibility. End users know performance benefit as Si-based AI model’s interpretability the business context their transistors reach practical limits. requirements, challenges organizations operate in, so they There are two examples of and opportunities from each are better-positioned to explain C-based transistors; graphene business unit. Integrate these their AI’s decisions and outputs and carbon nanotubes. Graphene findings into the development in human-understandable ways. is a one-atom thick material of of the enterprise information The need for explainable AI has pure carbon, bonded together in management strategy. implications for how IT leaders a hexagonal honeycomb lattice. A operate, such as consulting with carbon nanotube can be thought • Build partnerships with IT, the line of business, asking the of as a sheet of graphene rolled in particular with application right questions specific to the into a cylinder. The rolling- leaders, to explain how the AI business domain, and identifying up direction of the graphene model fits within the overall transparency requirements for layers determines the electrical design and operation of the data sources and algorithms. The properties of the nanotubes. business solution, and to give overarching goal is that models stakeholders visibility into need to conform to regulatory Position and Adoption Speed training data. requirements and take into Justification: Graphene is a account any issues or constraints hard material to create, as • Start with using AI to that the line of business has arranging carbon atoms in a augment rather than replace highlighted. New policies two-dimensional hexagonal lattice human decision making. around the inputs and boundary at a fairly large scale is difficult. Having humans make the conditions on the inputs into the Material quality can drastically ultimate decision avoids some AI subsystem, how anomalies are decrease with just one defect. complexity of explainable handled, how models are trained Graphene field-effect transistors AI. 
Sliding Into the Trough

Carbon-Based Transistors
Analysis By: Gaurav Gupta

Definition: Carbon-based transistors replace silicon in traditional transistors and offer an alternative solution for performance benefit as Si-based transistors reach practical limits. There are two examples of C-based transistors: graphene and carbon nanotubes. Graphene is a one-atom-thick material of pure carbon, bonded together in a hexagonal honeycomb lattice. A carbon nanotube can be thought of as a sheet of graphene rolled into a cylinder. The rolling-up direction of the graphene layers determines the electrical properties of the nanotubes.

Position and Adoption Speed Justification: Graphene is a hard material to create, as arranging carbon atoms in a two-dimensional hexagonal lattice at a fairly large scale is difficult. Material quality can drastically decrease with just one defect. Graphene field-effect transistors (GFETs) take the typical FET device and insert a graphene channel tens of microns in size between the source and drain. Graphene transistors have high device sensitivity and superior conductivity. Another issue is the lack of a band gap in graphene, which makes it very hard to turn the current off once it starts flowing, a major roadblock for logic operations, which require on-off switching. Researchers have been working to find solutions to this problem, but compounded with the lack of a fully integrated supply chain, graphene is far from commercial application.

Carbon nanotubes (CNTs) with semiconductor properties offer the promise of small transistors with high switching speeds in future semiconductor devices, while CNTs with metallic (conducting) properties hold the promise of low electrical resistance that can be applied to the interconnections within integrated circuits. Research indicates carbon nanotube FETs have properties that promise around 10 times the energy efficiency and far greater speeds compared to silicon. CNTs can be single- or multiwalled depending on the number of graphene layers, and as a result have different strength and efficiency. Currently, there are mixed opinions on whether CNT transistors would maintain their impressive performance at extremely scaled lengths. When fabricated at scale, the transistors often come with many defects that affect performance, so they remain impractical: there is currently no technology for their mass fabrication, and production costs are high.

User Advice: Semiconductor opportunity will be available for next-generation transistors beyond 5 nm. C-based transistors have been moved forward in their position on the Hype Cycle toward the Trough of Disillusionment as they are past their peak of expectation, and researchers and industry experts are now facing reality. Target audiences that will require these semiconductors must continue to work on fabrication at scale to resolve issues with mass production. Additionally, alternative next-generation transistor solutions are evolving that can challenge their position.

Business Impact: There is potential for a huge impact, particularly when silicon devices reach their minimum size limits, which is expected during the next five to 10 years. Wireless communications is an area where these technologies will be really beneficial due to their high current-carrying capability in a small area. An example of a current commercial application is Nantero's NRAM, which leverages carbon nanotube technology.

Benefit Rating: High
Market Penetration: Less than 1% of target audience
Maturity: Emerging
Sample Vendors: Fujitsu; Graphenea; imec; IBM; Intel; Nano-C; Samsung Electronics; TSMC
Bring Your Own Identity
Analysis By: David Mahdi; Felix Gaehtgens

Definition: Bring your own identity (BYOI) is the concept of allowing users to select and use an external (third-party) digital identity, such as a social identity (Facebook, VK, WeChat, etc.) or a higher-assurance identity (such as a bank identity or a government eID), to assert their identity in order to access multiple digital services. Service providers can be enabled to trust these external digital IDs for purposes of authentication and access to digital services, but also for sharing of identity attributes such as name and address.

Position and Adoption Speed Justification: BYOI consists of several mechanisms and technologies, each with their own level of adoption and maturity. Social identities are well established and have been the most commonly used type of digital ID with BYOI; however, this mechanism comes with privacy issues, as the use of social identities leaves a digital "bread crumb" (log of activity) with social media providers, and it offers a relatively low assurance of identity, as many social media providers do not perform identity proofing when establishing user credentials.

The period 2019-2020 saw much progress in comparison to earlier years. The EU electronic IDentification, Authentication and trust Services (eIDAS) regulation established minimum identity assurance requirements and mandated interoperability by 28 September 2018. Organizations delivering public digital services in an EU member state must now recognize electronic identification from all EU member states, and EU Trusted Service Providers can enable users to sign legal documents using digital signatures. In Canada, SecureKey launched Verified.Me, in addition to the already established Concierge service. WeChat was chosen to deliver an electronic eID in Guangzhou, China in late 2018, with expansion to other provinces in 2019.

Financial institutions also forged ahead with interoperable digital identities. In the Nordics, partnerships between the government and financial institutions are already established. Capital One's Identity Services was launched in 2017 and expanded, including through acquisitions of technology. Mastercard announced a consumer-centric model for digital identity in 2019. Other tech titans such as Apple announced and launched "Sign in with Apple" in 2019, which leverages Apple digital identities. Furthermore, new and innovative approaches to decentralized identity, also known as "blockchain identity" and "self-sovereign identity," are spawning a lively mix of startups and industry consortia, and large technology providers are also investing in this area.

User Advice: Recognize that the proliferation of siloed, noninteroperable digital identities will not scale with the needs of digital business. Determine how to take value from, or in some cases contribute to, the BYOI landscape. Especially for B2C or G2C initiatives, there are potential risks that can arise from not leveraging BYOI, such as:

• Loss of customers: Carefully determine how the friction of using legacy approaches reduces customer experience (CX) and thus customer retention. BYOI can alleviate this.

• Honeypot for identity credentials and personal information: Mitigate risks here by enabling customers and other users to rely on BYOI. By not considering BYOI, however, full responsibility for identity and credential exposure remains with the enterprise.

Focus on reducing friction by leveraging common BYOI uses such as account registration and login. Creating a great CX can offset the risks of diluting the brand and losing ownership of the customer journey.

Ensure the level of trust provided by the identity provider (IdP) matches the level of risk, or that the identity provider provides trust elevation to bridge any gap.

Determine the overall model of the approach to consumer access: Will you accept other methods for BYOI (i.e., accept third-party identities)? Will you be a third-party IdP that offers identities for consumption by other organizations?

Business Impact: BYOI offers the potential to leverage outside identities to help reduce friction and to increase adoption, security and overall end-user satisfaction. Exploiting higher-trust BYOI for customer registration potentially avoids the cost of doing your own identity proofing, as can relying on high-assurance IdPs to perform appropriate risk assessment and MFA at authentication, lowering the barrier to new business models that require higher levels of identity assurance. This can cause some transformational impacts on certain industries, especially in the era of digital business.

Many organizations have made significant investments in their IAM approach to retain the customer, and have therefore established themselves as custodians of digital identity. However, only a small number of these organizations will likely be able to monetize their existing client base by becoming a third-party IdP. Key decision points that can motivate a move in this direction include:

• Monetization of identity attributes

• Brand loyalty

• User demographics

• Security and privacy concerns

Benefit Rating: Transformational
Market Penetration: 5% to 20% of target audience
Maturity: Early mainstream
Sample Vendors: Amazon; Apple; Evernym; Facebook; ForgeRock; Google; Microsoft; SecureKey; Signicat; Twitter
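In practice, BYOI is usually wired up with standard federation protocols such as OpenID Connect. The sketch below shows the core of an authorization-code exchange with an external IdP; the endpoints, client credentials and redirect URI are hypothetical placeholders rather than anything prescribed above, and a production relying party would also validate the returned ID token's signature, issuer, audience, expiry and nonce before trusting the asserted identity.

```python
# Minimal OpenID Connect relying-party sketch for a BYOI login, assuming the `requests`
# library. All endpoints and credentials below are hypothetical placeholders.
import secrets
from urllib.parse import urlencode

import requests

IDP_AUTHORIZE_URL = "https://idp.example.com/oauth2/authorize"  # hypothetical external IdP
IDP_TOKEN_URL = "https://idp.example.com/oauth2/token"          # hypothetical external IdP
CLIENT_ID = "my-relying-party"                                   # hypothetical
CLIENT_SECRET = "change-me"                                      # hypothetical
REDIRECT_URI = "https://app.example.com/auth/callback"           # hypothetical


def build_login_url() -> tuple[str, str]:
    """Step 1: redirect the user to the external IdP with a fresh anti-CSRF state value."""
    state = secrets.token_urlsafe(16)
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid profile email",  # identity attributes the service asks the IdP to share
        "state": state,
    }
    return f"{IDP_AUTHORIZE_URL}?{urlencode(params)}", state


def exchange_code(code: str) -> dict:
    """Step 2: after the callback, swap the one-time code for tokens over a back channel."""
    response = requests.post(
        IDP_TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
        timeout=10,
    )
    response.raise_for_status()
    # The JSON payload contains an ID token (a signed JWT) asserting the external identity.
    # It must be verified (signature, issuer, audience, expiry, nonce) before it is trusted.
    return response.json()
```

Accepting a higher-assurance IdP, such as a bank identity or a government eID, typically changes the endpoints and the scopes requested rather than the overall shape of this flow.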
Ontologies and Graphs
Analysis By: Anthony Mullen

Definition: Ontologies and graphs enable users to model a set of concepts, categories, properties and relationships in a particular domain. They support the development of a consistent terminology and allow complex relationships to be represented, including part-whole relations, causation, material constitution, plurality and unity. They are often used to abstract away from underlying relational schemas and can be seen as a flexible knowledge network with broad use across many NLT use cases. OWL and RDF are popular standards for ontology definitions.

Position and Adoption Speed Justification: Today the heavy burden of humans alone managing ontologies is reduced by using ML to support their creation, maintenance and tuning. Workflows are also maturing in this space to create human-in-the-loop designs supporting human experts and users in the effort to develop and maintain them. Many semantic platforms have pivoted to integrate symbolic (e.g., ontologies) and subsymbolic (e.g., DNNs) approaches over recent years, which has improved NLT performance.

Ontologies are often a component of broader hybrid AI systems, and we see their use across the following NLT markets: speech to text, insight engines, text mining, conversational systems and natural language generation. While many end users will indirectly use vendor ontologies, few develop and maintain their own. However, the proliferation of custom-made NLT use cases will spur many end users to develop their own.

User Advice: As NLT proliferates in organizations, there is an inevitable increase in inconsistencies of terms and concepts across business units, partners and industries, which ultimately hampers systemic improvement. To counter information architecture problems, many end users use ontologies as a necessary abstraction away from service and technology platform relational schemas.

Their use is very broad and applicable to numerous industries and problems. Examples of their application include:

• Product catalogues and discovery

• Enterprise search

• Marketing collateral development

• Content management in media organizations

• Cause-and-effect modelling in health populations

• Representations of digital twins in manufacturing

End users should:

• Check to see if any large-scale ontologies are available for their industry or within their existing applications.

• Master entity, intents and relationship definitions for NLT projects (e.g., chatbots) with an ontology, making them reusable for other NLT projects.

• Represent product catalogues and services as an ontology to enable richer collaborations between processes and partners.

• Use them to speed identification when there are multiple points to triangulate (faster than a relational database search).

• Capture and represent tacit and implicit knowledge from employees who are due to retire.

• Support generation of reports (e.g., sales, quarterly) using NLG.

• Hire librarians to complement the data science team to manage ontological models.

• Consider ontology vendors and their wider offering, specifically how they relate ontologies (definitions) to graphs (expressions of ontologies as data).

Vendors should seek to make their ontologies available as an asset, in a marketplace, rather than as a hidden mechanic for the end users they serve, and to use them to expand data and service partnerships in the NLT space.

Business Impact: As investment in and dependence on NLT increase among consumers, enterprises and vendors, we will see large-scale ontologies become a foundational approach for concept and relationship modelling by organizations. Ontologies are one major tool against the fragmentation that multiple NLT projects and vendors bring. An important dimension to this technology is the ease with which ontologies can be generated and maintained, and this has improved with both accessible UIs and machine learning as part of the workflow. Ontologies also represent an easy-to-use bridge to external/linked data, allowing organizations to improve their analytical and automation capabilities.

Benefit Rating: High
Market Penetration: 5% to 20% of target audience
Maturity: Early mainstream
Sample Vendors: Expert System; Ontotext; PoolParty; Smartlogic; Synaptica; Taiger; Yactraq
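As a concrete illustration of the OWL/RDF approach described above, the following Python sketch builds a tiny product-catalogue ontology with the open-source rdflib library and then walks the class hierarchy with a SPARQL query. The namespace, class and property names are invented for the example and are not part of any standard vocabulary.

```python
# Minimal product-catalogue ontology sketch using rdflib; all names are illustrative.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.com/catalogue#")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# Concepts and relationships: a small class hierarchy plus one part-whole style relation.
g.add((EX.Product, RDF.type, RDFS.Class))
g.add((EX.SavingsAccount, RDFS.subClassOf, EX.Product))
g.add((EX.hasComponent, RDF.type, RDF.Property))

# Instances described with the shared terminology.
g.add((EX.basicSaver, RDF.type, EX.SavingsAccount))
g.add((EX.basicSaver, RDFS.label, Literal("Basic Saver")))
g.add((EX.basicSaver, EX.hasComponent, EX.debitCard))

# Traversing the knowledge network with SPARQL: find every instance of any Product subclass.
results = g.query(
    """
    SELECT ?item ?label WHERE {
        ?cls rdfs:subClassOf* ex:Product .
        ?item a ?cls .
        OPTIONAL { ?item rdfs:label ?label }
    }
    """,
    initNs={"ex": EX, "rdfs": RDFS},
)
for item, label in results:
    print(item, label)
```

The same triples can be loaded into any RDF store, which is what lets an ontology be treated as a shared asset across NLT projects rather than a hidden mechanic of a single application.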

Appendixes

Figure 3. Hype Cycle for Emerging Technologies, 2019

Hype Cycle Phases, Benefit Ratings and Maturity Levels

Table 1. Hype Cycle Phases

Innovation Trigger: A breakthrough, public demonstration, product launch or other event generates significant press and industry interest.

Peak of Inflated Expectations: During this phase of overenthusiasm and unrealistic projections, a flurry of well-publicized activity by technology leaders results in some successes, but more failures, as the technology is pushed to its limits. The only enterprises making money are conference organizers and magazine publishers.

Trough of Disillusionment: Because the technology does not live up to its overinflated expectations, it rapidly becomes unfashionable. Media interest wanes, except for a few cautionary tales.

Slope of Enlightenment: Focused experimentation and solid hard work by an increasingly diverse range of organizations lead to a true understanding of the technology's applicability, risks and benefits. Commercial off-the-shelf methodologies and tools ease the development process.

Plateau of Productivity: The real-world benefits of the technology are demonstrated and accepted. Tools and methodologies are increasingly stable as they enter their second and third generations. Growing numbers of organizations feel comfortable with the reduced level of risk; the rapid growth phase of adoption begins. Approximately 20% of the technology's target audience has adopted or is adopting the technology as it enters this phase.

Years to Mainstream Adoption: The time required for the technology to reach the Plateau of Productivity.

Source: Gartner (July 2020)

Table 2. Benefit Ratings

Transformational: Enables new ways of doing business across industries that will result in major shifts in industry dynamics.

High: Enables new ways of performing horizontal or vertical processes that will result in significantly increased revenue or cost savings for an enterprise.

Moderate: Provides incremental improvements to established processes that will result in increased revenue or cost savings for an enterprise.

Low: Slightly improves processes (for example, improved user experience) that will be difficult to translate into increased revenue or cost savings.

Source: Gartner (July 2020)

Table 3. Maturity Levels

Embryonic. Status: In labs. Products/Vendors: None.

Emerging. Status: Commercialization by vendors; pilots and deployments by industry leaders. Products/Vendors: First generation; high price; much customization.

Adolescent. Status: Maturing technology capabilities and process understanding; uptake beyond early adopters. Products/Vendors: Second generation; less customization.

Early mainstream. Status: Proven technology; vendors, technology and adoption rapidly evolving. Products/Vendors: Third generation; more out-of-box methodologies.

Mature mainstream. Status: Robust technology; not much evolution in vendors or technology. Products/Vendors: Several dominant vendors.

Legacy. Status: Not appropriate for new developments; cost of migration constrains replacement. Products/Vendors: Maintenance revenue focus.

Obsolete. Status: Rarely used. Products/Vendors: Used/resale market only.

Source: Gartner (July 2020)

Source: Gartner Research, G00 450415, B. Burke, M. Resnick, A. Gao, 24 July 2020

About Ant Group

Ant Group aims to create the infrastructure and platform to support the digital transformation of the service industry. We strive to enable all consumers and small businesses to have equal access to financial and other services that are inclusive, green and sustainable.

Contact us antgroup.com

The Top 10 Fintech Trends for 2021 is published by Ant Group. Editorial content supplied by Ant Group is independent of Gartner analysis. All Gartner research is used with Gartner's permission, and was originally published as part of Gartner's syndicated research service available to all entitled Gartner clients. © 2020 Gartner, Inc. and/or its affiliates. All rights reserved. The use of Gartner research in this publication does not indicate Gartner's endorsement of Ant Group's products and/or strategies. Reproduction or distribution of this publication in any form without Gartner's prior written permission is forbidden. The information contained herein has been obtained from sources believed to be reliable. Gartner disclaims all warranties as to the accuracy, completeness or adequacy of such information. The opinions expressed herein are subject to change without notice. Although Gartner research may include a discussion of related legal issues, Gartner does not provide legal advice or services and its research should not be construed or used as such. Gartner is a public company, and its shareholders may include firms and funds that have financial interests in entities covered in Gartner research. Gartner's Board of Directors may include senior managers of these firms or funds. Gartner research is produced independently by its research organization without input or influence from these firms, funds or their managers. For further information on the independence and integrity of Gartner research, see "Guiding Principles on Independence and Objectivity" on its website, http://www.gartner.com/technology/about/ombudsman/omb_guide2.jsp.