Is GPT in the car only for human-vehicle interaction? Automakers are still holding back bigger moves
Not long ago, Mercedes-Benz integrated ChatGPT into its cars and began a three-month test. The results showed that the voice assistant can not only carry out simple commands but also hold multiple rounds of continuous dialogue, with greatly improved comprehension and response quality.
Automakers such as Li Auto, Skyworth, and NIO quickly followed, using cutting-edge GPT capabilities to make their in-car systems smarter. The in-car system has long since evolved from the original "radio" into a feature-rich smart terminal; with a GPT "brain" added, it is turning from a dull, flavorless machine into a driving companion.
Human-vehicle interaction is not the endpoint of AI in cars; autonomous driving is the future. Earlier autonomous-driving solutions relied too heavily on high-definition maps: once map updates fall behind ever-changing road conditions, driving safety is threatened. The evolution and upgrading of large AI models has given car companies an opening.
Letting AI actively perceive and make decisions, shedding the dependence on high-definition maps, is becoming the mainstream trend. A few days ago, Li Auto began beta testing urban NOA (navigation-assisted driving). It uses a BEV (bird's-eye view) large model as the main solution, letting the car imitate the "brain circuits" of a human driver. Through continuous learning, urban NOA can even be trained into a "driver" on the user's commuting route.
Having taken over the internet, AI is now transforming cars at a deeper level, and these four-wheeled giants are looking more and more like Transformers.
In-car systems + GPT: Mercedes-Benz fires the first shot
A "metamorphosis" from the inside out is sweeping the automotive world: from traditional fuel power to new energy, from driving tools to intelligent products. Over the years, driven by technology, cars have kept changing inside and out. After the internet transformed cars, artificial intelligence has arrived.
Mercedes-Benz is taking the lead in this new wave, bringing ChatGPT into the car.
On June 16, Mercedes-Benz's three-month ChatGPT test program launched in the United States. Working with Microsoft, it integrated ChatGPT into the car through the Azure OpenAI Service. Owners can opt in through the Mercedes me app, or in a more intuitive way: say the voice command "Hey Mercedes, I want to join the test program" in the car, and the MBUX infotainment system will connect the "Hey Mercedes" voice assistant to ChatGPT automatically.
Previously, Hey Mercedes could provide information such as sports and weather, answer questions about the vehicle's surroundings, and even control the user's smart home, all standard features. ChatGPT makes the question-and-answer flexible: users can ask for details about a destination, get dinner suggestions, and keep asking follow-up questions and receiving answers. Sustained, contextual dialogue is ChatGPT's signature skill.
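The "ask continuously, answer continuously" behavior comes from the model seeing the whole conversation history on every turn. A minimal sketch of that mechanism, with a toy stand-in for the hosted model (the class and function names here are invented for illustration, not Mercedes-Benz's or Azure's API):

```python
def stub_model(messages):
    # Toy stand-in for the hosted LLM: it just reports which user turn
    # it is answering, to show that it sees the full history each time.
    user_turns = sum(1 for m in messages if m["role"] == "user")
    return f"(reply to turn {user_turns})"

class CarAssistant:
    def __init__(self, system_prompt):
        # The conversation is a growing list of role-tagged messages.
        self.messages = [{"role": "system", "content": system_prompt}]

    def ask(self, text):
        # Append the user turn, call the model with the full history,
        # then append the reply so follow-up questions keep context.
        self.messages.append({"role": "user", "content": text})
        reply = stub_model(self.messages)
        self.messages.append({"role": "assistant", "content": reply})
        return reply

bot = CarAssistant("You are the in-car voice assistant.")
bot.ask("Find a restaurant near my destination.")
answer = bot.ask("Which of those has convenient parking?")
print(answer)  # the follow-up is resolved against the stored history
```

In a real deployment, `stub_model` would be a call to a hosted chat-completions endpoint, with the same growing message list sent on each turn.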
Currently, only about 900,000 MBUX-equipped Mercedes-Benz vehicles in the United States can test ChatGPT first. Mercedes-Benz intends to use this initial testing period to gain a deep understanding of user requests, in order to set future development priorities and adjust its rollout strategy for different markets and languages.
Regarding the ChatGPT integration, Mercedes-Benz gave an emotional statement: "All goals revolve around redefining your relationship with Mercedes." Mercedes-Benz wants ChatGPT to reshape the human-vehicle interaction experience. A more vivid analogy: the in-car system has "come alive," changing from a dull, purely functional machine into a life companion inside the car.
After Mercedes-Benz, domestic automakers rushed to keep up.
On June 19, Li Auto unveiled Mind GPT, a self-developed cognitive large model built by its spatial-algorithm team. Training of the model is said to have started well before ChatGPT's release. Based on 10 TB of raw training data, Mind GPT used 1.3 trillion tokens for base-model training. It can recognize voiceprints and speech content, understand dialects, provide owners with travel planning, and even offers features such as AI drawing and AI computation.
Li Auto revealed that with the release of Mind GPT it will add a new LUI (language user interface) interaction mode that can, for example, "automatically calculate the travel route."
Skyworth Auto also announced recently that two of its models, the Skyworth EV6 Ⅱ and the Skyworth HT-i Ⅱ, have integrated ChatGPT into their smart cockpits. In addition, four automakers, Great Wall Motor, NIO, XPeng, and Chery, have all applied for GPT-related trademarks in the past month.
GPT in the car has become a trend. Zhang Junyi, managing partner at consultancy Oliver Wyman, believes that access to GPT technology can improve a car's human-machine interaction and its ability to handle complex, context-dependent queries. In the future, brand differences in hardware at the same price point will keep shrinking; when comfort, safety, power, and range can no longer produce much differentiation, competing on intelligence becomes the inevitable choice.
Giving the smart cockpit a "brain"
ChatGPT in the car is another remarkable step in the automobile's evolution: the most cutting-edge natural-language model applied to a human travel tool, promising a much richer in-car life.
Looking back more than 30 years, in-car entertainment and intelligence were still blank spaces. The first generation of in-car systems was born in the 1980s and 1990s, when buyers focused mainly on a car's "big three": engine, chassis, and gearbox. Then some models could not only tune into the radio but also swallow cassette tapes and play music freely, and the car took on the first shadow of a second living space.
The second generation added DVD players and MP3. While entertainment stood out, in-car navigation pushed the driving experience another step forward, and solving "road blindness" became the mainstream demand. Many veteran drivers will remember that in the era before the Internet of Vehicles, dedicated navigation devices became standard on high-end models, using GPS satellite positioning and onboard map packages to achieve reasonably accurate navigation.
But beyond navigation, music, and radio, people at the time did not expect much from the in-car system, and it was rarely the deciding factor in a car purchase.
In the 21st century, as electronic and digital technology kept advancing, the mobile phone changed form first. Following the same evolutionary logic, large screens appeared in cars and intelligence became a new selling point. In-car systems based on Linux, WinCE, and Android were adopted by manufacturers one after another. Cars could then offer free real-time navigation as well as panoramic visualization and driver-assistance features such as 360-degree surround view.
Once the car was connected to the internet, everything changed again. Online video, road books, voice control, scheduled maintenance, remote diagnostics, and other features were added to the in-car system. Center-console screens grew ever larger and more capable, and manufacturers have recently been racing toward "full screens," fitting displays even for the front passenger and the rear seats.
Finally, the concept of the "third screen" has grown more and more prominent. OEMs hope the in-car system can become the third generation of smart terminal to shape human life, after the computer and the mobile phone. Winning user mindshare with technology-rich cars and expanding into more business models has become the direction of today's car companies.
Now the original notion of the "in-car system" is gradually being replaced by the "smart cockpit." NIO even coined the term "second living room." Beyond ever-smarter software, car companies are competing on interior materials, sound systems, and ambient lighting. NIO also released a pair of AR glasses that support watching movies on a giant virtual screen in the car, and the Li Auto L9 even comes with a rear refrigerator, making the car a mobile home.
However, whether in the old in-car system or the new smart cockpit, voice dialogue has long been a lagging feature, even though, for driving safety, voice control is essential.
Over the past decade, almost every car company and a large number of AI startups have invested heavily in natural-language processing, hoping to optimize in-car voice interaction. Many systems can answer simple preset commands, such as raising the temperature or reading the weather forecast, and upgrades have revolved around broadening the set of natural-language commands they recognize.
But when owners want the car to understand more "human speech," such as planning a route spoken in dialect or finding a restaurant, it often performs worse than the owner simply using a maps app or Dianping on a phone. Richer voice-based human-vehicle interaction hit a bottleneck, until ChatGPT appeared.
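The pre-ChatGPT limitation described above can be caricatured as a lookup over preset phrases; anything outside the table, whether a dialect or a compound request, simply falls through. The command table below is invented for illustration:

```python
# Hypothetical preset-command table of the kind older in-car voice
# systems relied on. These action names are made up for this sketch.
PRESET_COMMANDS = {
    "increase temperature": "hvac.temp_up",
    "decrease temperature": "hvac.temp_down",
    "weather forecast": "info.weather",
}

def handle(utterance):
    # Exact phrase matching: no paraphrase, no dialect, no compounds.
    action = PRESET_COMMANDS.get(utterance.lower().strip())
    return action or "sorry, I didn't understand"

print(handle("Increase temperature"))                      # matches a preset
print(handle("plan a route and then find hot pot nearby")) # falls through
```

Broadening such a table one phrase at a time is exactly the upgrade path that stalled before large language models arrived.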
Large natural-language models opened directly to consumers (ChatGPT, Wenxin Yiyan, Tongyi Qianwen, and others) have shown smart-cockpit developers the dawn: with strong comprehension and logical reasoning, the in-car system could become a true driving assistant, with hidden business possibilities besides.
For example, an owner can tell the voice assistant: "Find a hot pot restaurant near my destination that has a group-buying discount and a rating above 4.5. Five of us will be eating shortly, so reserve a table, and then see where parking is convenient." In the past, no in-car system could possibly digest that much information at once; for ChatGPT, this is a basic operation. As long as there are enough real-time data sources, the possibilities for meeting such requests are nearly unlimited.
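One way such a compound request becomes actionable is to have the model emit a structured intent that downstream services can execute. A hypothetical sketch (the JSON schema, field names, and task names are invented for illustration, not any vendor's format; the JSON is hand-written here to stand in for model output):

```python
import json

# Hand-written stand-in for what an LLM could be prompted to extract
# from the hot pot request above.
llm_output = """
{
  "cuisine": "hot pot",
  "near": "destination",
  "min_rating": 4.5,
  "requires_group_discount": true,
  "party_size": 5,
  "tasks": ["find_restaurant", "reserve_table", "find_parking"]
}
"""

intent = json.loads(llm_output)

# Each sub-task can then be routed to a real-time data source
# (maps, reviews, booking), which is where the article's point about
# data sources deciding the ceiling comes in.
for task in intent["tasks"]:
    print(task, "-> dispatch to service")
```

The single utterance thus fans out into three service calls, which is exactly what keyword-matching systems could never do.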
The addition of GPT not only makes dialogue more fluent but gives the in-car system a "brain" that can understand needs and generate answers, not merely respond to commands. How high its IQ is and how fast it reacts depend on how well the automaker can train its in-car large model, and on whether it is willing to pay up for better hardware (chips).
How does AI make the "brain circuits" of autonomous driving more human?
The richness of in-car life is gradually turning the car into a carrier full of warmth: no longer a boring, cold means of transport but a comfortable living space.
And the AI-led evolution of the car is not limited to GPT in the cockpit; it is even more significant for the technological advancement of autonomous driving.
The traditional approach to autonomous-driving research is to collect large-scale driving data and log ever more test mileage in order to cover every possible driving scenario, so that the car has a preset response plan when an emergency occurs. But the complexity of emergencies is often unpredictable; once the system lacks a plan for a particular situation, driving safety is gravely threatened.
This is why today's driver-assistance systems still require the driver to keep hands on the steering wheel to handle real-time emergencies. AI's capacity to learn may change this status quo.
Not long ago, a research team at Tsinghua University proposed "trustworthy continuous evolution" for autonomous driving: starting from basic active avoidance, the system keeps improving, achieving better driving performance while guaranteeing safety.
Put simply, with AI, a car with autonomous-driving functions can actively learn and become familiar with the new scenarios it encounters and keep evolving; as mileage and data accumulate, its performance continuously improves.
Li Auto is applying large AI models to autonomous driving. On June 17 it announced the beta of urban NOA (navigation-assisted driving) and said the commuting NOA feature would open to users in the second half of the year. Unlike conventional solutions, Li Auto uses a large BEV (bird's-eye view) model to perceive and understand road-structure information in the environment in real time, so the car can better imitate the operating habits of human drivers.
In the past, most driver-assistance systems relied on high-definition map solutions, which effectively feed road conditions to the system in real time for it to act on. But on complex urban roads there are always areas that high-definition maps cannot cover or update in time, a major shortcoming of that approach. With a BEV large model, the AI instead actively perceives real-time road conditions and makes driving decisions on its own.
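As a rough intuition for what "bird's-eye view" means (a toy rasterization only, not Li Auto's actual model): perception output is projected into an ego-centred top-down grid that the planner can reason over, instead of being looked up in a pre-built map. The grid size, cell resolution, and obstacle points below are invented for the sketch:

```python
CELL = 0.5   # metres per grid cell (assumed resolution)
HALF = 50    # grid covers +/- 25 m around the ego vehicle

# Top-down occupancy grid centred on the ego vehicle.
grid = [[0] * (2 * HALF) for _ in range(2 * HALF)]

# Detected obstacle points: (x forward, y left), in metres.
obstacles = [(10.0, 2.0), (10.2, 2.1), (-5.0, -8.0)]

for x, y in obstacles:
    i = int(HALF + x / CELL)   # forward axis -> row
    j = int(HALF + y / CELL)   # lateral axis -> column
    if 0 <= i < 2 * HALF and 0 <= j < 2 * HALF:
        grid[i][j] = 1         # mark the cell occupied

occupied = sum(map(sum, grid))
print(occupied)  # the two nearby detections merge into one cell
```

The occlusion problem the next paragraph describes maps directly onto this picture: a blocked sensor simply leaves holes in the grid.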
Of course, BEV has weaknesses too. At some large intersections with many passing vehicles, the sensors' field of view is easily blocked, so local information drops out of the vehicle's real-time perception. To compensate, Li Auto is said to add a neural prior network (Neural Prior Net, NPN) and an end-to-end traffic-light intent network: the former supplies previously learned features of complex intersections as a reference, while the latter learns from how large numbers of human drivers respond to signal changes at intersections, helping the system understand traffic lights.
According to real-world test feedback, Li Auto's urban NOA cannot yet fully drive on its own. It sometimes fails to turn in time and is not good at overtaking; moreover, facing certain unusual obstacles, the algorithm cannot decide and the driver must take over.
However, compared with traditional training methods, the biggest change the large model brings is a stronger learning ability, which means the automatic-driving capability will gradually improve. A typical case is Li Auto's commuting NOA feature: before enabling it, the owner first sets the commuting route, and NPN features accumulate through automatic training during daily commutes. After roughly one to three weeks, the AI can grow into the "driver" for that commuting stretch.
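The commuting-NOA flow just described can be caricatured as per-route feature accumulation with a maturity threshold. The class, the 14-drive threshold, and the route names below are all invented for illustration (the article only says "about 1 to 3 weeks"):

```python
from collections import defaultdict

class CommuteTrainer:
    # Hypothetical threshold: roughly two weeks of daily commutes.
    READY_AFTER = 14

    def __init__(self):
        # route -> number of automatic training drives accumulated
        self.drives = defaultdict(int)

    def record_drive(self, route):
        self.drives[route] += 1

    def noa_ready(self, route):
        # Offer assisted driving only once the route is "learned".
        return self.drives[route] >= self.READY_AFTER

trainer = CommuteTrainer()
for _ in range(14):
    trainer.record_drive("home->office")
print(trainer.noa_ready("home->office"))  # True: the route has matured
```

The point of the caricature is that capability is per-route and grows with exposure, rather than being shipped fully formed.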
This process embodies how a self-driving car operates with a large AI model behind it: first learn and become familiar with the road conditions, then perform assisted driving. The "brain circuit" is more like a human's.
Li Auto did not originate the use of large AI models for autonomous driving; Tesla did. As early as 2021, Tesla launched a BEV perception solution based on the Transformer architecture, and companies such as Huawei and Baidu have since laid out their own "BEV + Transformer" stacks, implementing and continuously optimizing urban NOA features.
The continuous evolution of large models may well let car companies find a breakthrough for autonomous-driving technology, and shedding the dependence on high-definition maps is the first step. Today's systems are still at the "assisted driving" stage; in the future, you are likely to hand the wheel over to AI.