Microsoft opens up its AI universe: Copilot comes to Windows, an AI application store launches, and Bing connects to ChatGPT

Text: Zhou Xinyu

Editor: Anita Tang

"Two simple letters are all it takes to accurately convey this year's major shift in tech: the A and the I."

From May 23 to May 25, 2023, Microsoft's annual developer conference, Build, set the tone for the AI industry's development over the coming year. Since teaming up with OpenAI, Microsoft's "AI re-engineering" of its entire business has become the industry's bellwether: the launch of New Bing, a search engine connected to GPT-4, and of Copilot, an AI assistant platform for Office, the collaboration software Teams, and other business applications.

At this year's Build conference, Microsoft once again played its trump card, launching a full suite of AI offerings spanning software applications and cloud services. Among them, Copilot and more than 50 new plug-ins are the undisputed protagonists of the conference.

As a bridge between natural language and programming languages, Copilot is seen as a key component for helping users invoke AI capabilities in the AI era. At this year's conference, Copilot expanded from Microsoft 365 and GitHub to Windows and databases.

As carriers of model capabilities, plug-ins form an application ecosystem built on AI models. This year's Build opens with Bing and ChatGPT fully sharing a plug-in ecosystem.

In addition, continuing the theme of Build 2022, Microsoft is pushing low-code/no-code development even further: an Azure service ecosystem for AI application development is taking shape.

Copilot fully transforms Windows: users can invoke apps without code

Two months ago, the integration of the AI assistant Copilot into Microsoft 365 shocked the entire industry. Now Microsoft has added fuel to the fire by embedding Copilot into Windows, making Windows the world's first PC operating system with built-in AI services.

Simply put, "asking questions" can solve everything—users can input commands in the Copilot interface, and Windows Copilot can automatically invoke all applications in the system, which greatly simplifies the process and time of searching, enabling, and cross-application compression.

Using Windows Copilot to invoke the music player Spotify

Windows still firmly dominates PC operating systems. As of January 2022, 1.4 billion monthly active devices were running Windows 10 or 11, accounting for more than 70% of the global operating system market.

Plugging Copilot into Windows, so that applications are invoked on the user's behalf rather than through click-based interaction, therefore marks the official start of a remaking of both user interaction and software itself.

Kevin Scott, Microsoft's chief technology officer and executive vice president of AI, said that just as it is hard to imagine software that is not connected to the Internet, Copilot will in the future be as important as the Internet itself.

"Microsoft is giving developers all the tools they need to build Copilot," he said. "In the next few years, Copilot will become standard for all software production."

Currently, Windows Copilot integrates Bing Chat as well as other first-party and third-party plug-ins. A preview of Windows Copilot will launch in Windows 11 in June of this year.

Rebuilding Bing on ChatGPT: more than 50 plug-ins already connected

Since the release of ChatGPT, the biggest AI transformation has come first to search: in February this year, Microsoft announced the new ChatGPT-powered Bing search engine and Edge browser.

Now, Microsoft plans to break down the barriers between the Bing and ChatGPT ecosystems entirely, in order to build a more open plug-in ecosystem.

On the one hand, Microsoft announced that the new Bing will adopt the same open plug-in standard as ChatGPT, so that plug-ins are interoperable across ChatGPT and the breadth of Microsoft's Copilot products.

For example, during a conversation with ChatGPT, users can ask questions that require up-to-date information; ChatGPT will return timely answers with source links, through which users can visit the corresponding Bing web pages.

ChatGPT with access to Bing Chat.

As another example, now that Bing shares ChatGPT's plug-in ecosystem, the relevant plug-in can be invoked automatically when users search in Bing. With the plug-in from the real-estate service Zillow enabled, for instance, a user searching for houses in Chicago in Bing can be handed off to Zillow for further inquiries.

Bing integrates the plug-in function of ChatGPT.

Bing is steadily expanding its roster of supported plug-ins. At present, Microsoft offers customers more than 50 plug-ins, including the restaurant reservation platform OpenTable and the computational knowledge engine Wolfram Alpha, and it announced that the number of plug-ins supported by Bing will grow into the thousands.
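For developers, the shared standard means a single plug-in definition can target both ChatGPT and Bing. Below is a minimal sketch of such a plug-in manifest, following OpenAI's published ai-plugin.json format at the time; the "HouseFinder" service, URLs, and descriptions are invented for illustration.

```python
import json

# Hypothetical plug-in manifest following OpenAI's ai-plugin.json format.
# The "HouseFinder" service, URLs, and descriptions are invented for illustration.
manifest = {
    "schema_version": "v1",
    "name_for_human": "HouseFinder",
    "name_for_model": "housefinder",
    "description_for_human": "Search home listings by city and price range.",
    "description_for_model": "Plugin for searching real-estate listings. "
                             "Use it when the user asks about homes for sale.",
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        "url": "https://housefinder.example.com/openapi.yaml",
    },
    "logo_url": "https://housefinder.example.com/logo.png",
    "contact_email": "support@housefinder.example.com",
    "legal_info_url": "https://housefinder.example.com/legal",
}

# The manifest is served by the plug-in host at /.well-known/ai-plugin.json;
# the referenced OpenAPI spec describes the endpoints the model may call.
with open("ai-plugin.json", "w") as f:
    json.dump(manifest, f, indent=2)
```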

On the other hand, Bing will be embedded in ChatGPT as the default search engine.

ChatGPT is currently the fastest-growing AI application, which means a huge number of users can now reach the web through it, solving the staleness problem of ChatGPT's original training data. From May 23, ChatGPT Plus subscribers can use the web-connected ChatGPT directly, while ordinary users can get the same capability by enabling the Bing plug-in.

The AI application store opens its shelves

With the arrival of the AI era, the role of the Microsoft Store is also changing: it is no longer just a place to download applications and games, but an important provider of AI-native application services, helping customers work more efficiently, get tasks done, and discover new content.

To this end, the Microsoft Store has introduced two features: AI-generated keywords and AI-generated review summaries. At the same time, it has launched a dedicated AI application section offering customers a range of AI services; for example, customers can use apps such as De, Krisp, and Podcastle to generate and edit video and audio, or Kickresume to generate a résumé.

AI store in Microsoft Store

Copilot Stack goes live: anyone can create a Copilot

"You have to remember that the model is not your product, unless you are an infrastructure company itself - just like the model itself is just the foundation to support the product, and it is nothing in itself."

At the conference, Microsoft CTO Kevin Scott described models as the soil from which disruptive applications grow. Unlike the starting point of traditional software development, a development platform built on AI capabilities can dramatically lower the barrier to turning ideas into reality.

Copilot Stack

To this end, Microsoft launched Copilot Stack, an AI development platform that helps developers build their own Copilots. Copilot Stack boils Copilot development down to three steps (see the sketch after this list):

  1. Select the base model, for example by calling GPT-4 through the Azure OpenAI Service;

  2. Compose AI capabilities. At this stage, the developer provides a "meta-prompt", a basic description of what the Copilot does and how it should behave;

  3. Connect to data and other services through plug-ins.
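As a rough sketch of what steps 1 and 2 can look like in code, here is a minimal call to a GPT-4 deployment through the Azure OpenAI Service using the openai Python library of the time (v0.27). The resource name, deployment name, and the expense-report meta-prompt are assumptions for illustration, not part of Microsoft's announcement.

```python
import os
import openai

# Step 1: select the base model by pointing at an Azure OpenAI resource.
# The resource and deployment names below are placeholders.
openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # hypothetical resource
openai.api_version = "2023-05-15"
openai.api_key = os.environ["AZURE_OPENAI_KEY"]

# Step 2: the "meta-prompt" describing what this Copilot does and how it behaves.
meta_prompt = (
    "You are an expense-report copilot for Contoso employees. "
    "Answer only questions about travel and expense policy, be concise, "
    "and ask a clarifying question when a request is ambiguous."
)

response = openai.ChatCompletion.create(
    engine="gpt-4",  # name of the GPT-4 deployment in Azure (placeholder)
    messages=[
        {"role": "system", "content": meta_prompt},
        {"role": "user", "content": "Can I expense a hotel upgrade on a client trip?"},
    ],
    temperature=0.2,
)
print(response.choices[0].message["content"])
```

Step 3 then layers plug-ins on top, so the same Copilot can fetch data or call external services instead of answering purely from the model.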

An open AI platform makes Copilots and plug-ins extensible: developers can grow the Copilot ecosystem through Copilot Stack, and create new plug-ins with the Microsoft Teams Toolkit for Visual Studio Code and Visual Studio.

To boost development efficiency, Microsoft also launched Dev Home, a developer tool hub in Windows 11 that streamlines developer workflows.

In addition, Microsoft is building Copilots and related features to assist plug-in development itself. For example, Visual Studio Code, GitHub Copilot, and GitHub Codespaces simplify the process of creating, debugging, and deploying new plug-ins, and Azure AI will later add the ability to run and test plug-ins against private enterprise data.

Azure AI Studio goes live, covering the full production pipeline for AI models and applications

At this year's Build conference, Microsoft proclaimed that "on Azure, we have everything we need to develop Copilots," and officially launched Azure AI Studio.

Azure AI Studio is, in effect, a productivity tool for the AI era, helping users produce AI models and applications. It comes with OpenAI-based models such as GPT-4, GPT-3.5, ChatGPT, and DALL·E 2 built in, and lets developers build, train, evaluate, and deploy AI models as well as develop AI applications grounded in their private data.

Users can upload private data in Azure AI Studio.

Users build applications on top of their private data.

Microsoft stresses that Azure AI Studio respects the user's organizational policies and access permissions, without compromising security, data policies, or document ranking. Customers can choose among different data deployment and access options (data inside or outside the enterprise), and Azure AI Studio supports a variety of data formats.
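Microsoft did not detail the mechanics at the session, but the general pattern behind grounding a chat model in private data while respecting access rights can be sketched generically. The following is an illustrative retrieval-style example only, not the Azure AI Studio API; the documents, groups, and permission check are invented.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    doc_id: str
    text: str
    allowed_groups: set  # groups permitted to read this document

# Invented private documents carrying access-control metadata.
PRIVATE_DOCS = [
    Doc("hr-001", "Parental leave is 16 weeks, fully paid.", {"hr", "all-staff"}),
    Doc("fin-007", "Q2 revenue target is $12M.", {"finance"}),
]

def retrieve(question: str, user_groups: set, k: int = 2):
    """Naive keyword retrieval that filters out documents the user may not read."""
    visible = [d for d in PRIVATE_DOCS if d.allowed_groups & user_groups]
    scored = sorted(
        visible,
        key=lambda d: sum(w.lower() in d.text.lower() for w in question.split()),
        reverse=True,
    )
    return scored[:k]

def build_grounded_prompt(question: str, user_groups: set) -> list:
    """Assemble a chat prompt whose context contains only permitted documents."""
    context = "\n".join(f"[{d.doc_id}] {d.text}" for d in retrieve(question, user_groups))
    return [
        {"role": "system", "content": "Answer using only the provided context and cite doc ids."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]

# A user in the HR group asking about leave only sees HR/all-staff documents.
print(build_grounded_prompt("How long is parental leave?", {"hr"}))
```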

Microsoft said that more than 4,500 companies have used Azure OpenAI services, including Coursera, Grammarly, Volvo and IKEA.

Microsoft Fabric launches: a data analytics tool for the AI era

What will the software of the future look like? Microsoft's first worked example is in data analytics.

The newly announced Microsoft Fabric, an end-to-end data analytics platform, integrates Microsoft's Power BI, Data Factory, and the next generation of Synapse.

Functionally, Microsoft Fabric covers data engineering, data integration, data warehousing, data science, real-time analytics, application observability, and business intelligence. Every workload connects automatically to OneLake, Fabric's built-in multi-cloud data lake, which is automatically indexed for use. By integrating these functions, Microsoft Fabric can tailor its services to customers with different technical backgrounds.

Microsoft Fabric also greatly lowers the barrier to doing data analysis. With Copilot, users can create dataflows, generate and complete code, build machine learning models, and visualize results through conversation and simple operations, without writing code. Users can even call models in the Azure OpenAI Service to build plug-in applications on top of their private data.
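To make the "no-code through dialogue" idea concrete: a request like "show me total sales by month" might be turned by a Copilot into a short snippet roughly like the one below. This is purely illustrative pandas code, not output from Fabric's Copilot; in Fabric the data would live in OneLake tables, and the table and column names here are invented.

```python
import pandas as pd

# Invented sales data standing in for a table that would live in OneLake.
sales = pd.DataFrame({
    "order_date": pd.to_datetime(["2023-01-05", "2023-01-20", "2023-02-11", "2023-03-02"]),
    "region": ["East", "West", "East", "West"],
    "amount": [1200.0, 950.0, 1430.0, 800.0],
})

# "Show me total sales by month" -> group by calendar month and sum the amounts.
monthly = (
    sales
    .assign(month=sales["order_date"].dt.to_period("M"))
    .groupby("month", as_index=False)["amount"]
    .sum()
)
print(monthly)
```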

In data analytics, the deep expertise required and the fragmentation of services are common pain points: the U.S. market alone has hundreds of data analytics applications, and doing serious analysis means buying many separate modules. With Copilot, Microsoft Fabric in effect unifies the functions of many of these tools.

Microsoft CEO Satya Nadella called Microsoft Fabric the company's most important release since Microsoft SQL Server, its SQL database product.

Summary: AI "family bucket", against open source

One clear trend: the focus of this year's Microsoft Build has shifted from last year's provision of AI tooling services to a stage of AI tools plus an AI application ecosystem.

Merely supplying computing power to OpenAI would not have been enough to forge such a deep partnership. To set this AI revolution in motion, Microsoft's existing enterprise software, enterprise service capabilities, and ecosystem are the key.

In a conversation with Microsoft CTO Kevin Scott, OpenAI co-founder Greg Brockman talked about the reasons for OpenAI's successful cooperation with Microsoft: "We have an end-to-end platform to build AI applications."

The so-called "end-to-end" not only requires enterprises to have a complete application and supporting tool ecosystem, but also requires a strong infrastructure such as the cloud. This also makes Brockman judge that the most interesting Copilot will not be established in the open source community.

In addition, "Open Eco (open ecology)" is a keyword that is constantly mentioned at the meeting. Microsoft's AI "family bucket" is obviously well prepared for this - the plug-in ecology across Bing and ChatGPT is aimed at individual users, while the low-code development ecology based on Azure services and Copilot is aimed at enterprise users.

In just a few months, the wave of change set off by AI models has spread from a single point of technology to the global software ecosystem. As the Copilot (co-pilot) accelerates, users around the world had better buckle up.
