Posts

Showing posts from May, 2023

Promoting Cutting-Edge Technology, Showcasing Leading Achievements: Witmem (ZhiCun) Technology at the 2023 Beijing Science and Technology Week

  On May 20, 2023, the opening ceremony of the National Science and Technology Week and Beijing Science and Technology Week, co-hosted by the Ministry of Science and Technology and the Beijing Municipal People's Government, was held in Beijing, officially kicking off the 12-day science and technology event. The 29th Beijing Science and Technology Week, held from May 20 to May 31, 2023, had the theme "Love Science, Advocate Science." It aimed to showcase Beijing's strength in scientific and technological innovation, popularize science, and present a vision of harmonious, sustainable urban life. Witmem (ZhiCun) Technology appeared at the main venue's "High-Performance Computing Chip" exhibition area, presenting its innovative technological achievements to the visiting delegations, enterprises, and the general public, and introducing the WTM2101 chip and its applications in the field of computing in memory. Computing in memory, or "CunSuanY...

Neural Networks Are Revolutionizing the Field of AI

  Neural networks are a powerful subset of artificial intelligence (AI) that have revolutionized the field of machine learning. They are modeled after the structure and function of the human brain, consisting of interconnected nodes or neurons that process and transmit information. Neural networks have been used to solve a wide range of tasks in AI, including image recognition, speech recognition, natural language processing, and even game playing. They are particularly useful for tasks that involve large datasets or complex patterns, as they can learn and improve over time by adjusting the connection strengths between neurons based on the input data and desired output. One of the most exciting applications of neural networks is in the field of image recognition. Image recognition involves analyzing large amounts of data to identify objects and patterns within images. Neural networks have proven to be highly effective at this task, achieving accuracy rates that rival or exceed thos...
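The learning mechanism described above, adjusting connection strengths based on input data and desired output, can be illustrated with a minimal sketch. This is a single-neuron perceptron trained on the logical AND function; it is a toy example for intuition only, not any specific system mentioned in these posts, and all names and numbers in it are chosen for illustration.

```python
# Minimal sketch: one artificial neuron "learns" by nudging its
# connection strengths (weights) whenever its output disagrees with
# the desired output. Seeded for reproducibility.
import random

random.seed(0)

def train_neuron(samples, epochs=50, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with 0/1 targets."""
    w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Weighted sum of inputs, then a hard threshold activation
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Strengthen or weaken each connection based on the error
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn logical AND from four labeled examples
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(data)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

After training, `predict` reproduces the AND truth table; real networks stack many such neurons and use gradient-based updates, but the adjust-weights-from-error loop is the same idea.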

The Future of Computing in Memory in China

  The future of computing in memory in China looks promising, as the country has made significant investments in research and development to advance the technology. Computing in memory refers to performing data processing tasks within computer memory rather than shuttling data back and forth between memory and storage devices, resulting in faster and more efficient computing. China has recognized the importance of this approach, with the goal of becoming a leader in computing in memory and accelerating the development of its semiconductor industry. The investment has already yielded results, with Chinese companies making significant advances in the area. For example, Yangtze Memory Technologies has developed a new type of NAND flash memory that uses a three-dimensional architecture to increase storage density and improve performanc...

Guideline- WTM2101 Hearing Aids Solution

  Hearing aids and cochlear implants are the current solutions for individuals with severe hearing loss. Beyond the requirements of miniaturization and low power consumption, with the advent of the artificial-intelligence era, intelligence is becoming a new trend in hearing aid development. Witmem (ZhiCun) Technology has introduced a hearing aid solution based on the WTM2101 chip, which incorporates a storage-computing integrated architecture that breaks the limits of the storage wall and combines high computing power with low power consumption. The chip can speed product launches and drive the transition to a new era of intelligent hearing aids. With its storage-computing integrated architecture, the WTM2101 delivers AI computing power of up to 50 GOPS (giga operations per second). Compared with existing wearable-device chips, it offers a tens-to-hundreds-fold improvement in AI computing power. ZhiCun Technology's storage...

ZhiCun Lecture & Peking University | AI Large Models Bring Changes to Computing Infrastructure

  On the evening of April 28, 2023, the ninth "ZhiCun Lecture" of the School of Information Science and Technology, part of the Frontiers of Information Science and Technology and Industrial Innovation course, was successfully held in Room 106 of the Science Teaching Building. Mr. Wang Shaodi, founder and CEO of ZhiCun Technology and a Peking University alumnus, was invited to speak on the theme "The Impact of AI Large Models on Computing Infrastructure". More than 30 teachers and students attended. The event was hosted by Professor Wang Runsheng, vice dean of the School of Information Science and Technology at Peking University. At the beginning of the lecture, Mr. Wang Shaodi briefly introduced the current state of the industry and of ZhiCun Technology. He stated that AI large models have reached a singularity and will not only generate huge economic benefits and bring significant changes to people's lives, but also have a great impact on t...

Computing In Memory and Neural Networks Drive AI Development

  Computing in memory and neural networks are two fast-evolving technologies that are transforming the fields of computing and artificial intelligence (AI). Both technologies have the potential to significantly improve the performance and efficiency of computing systems, and they are increasingly being used in a wide range of applications. Computing in memory refers to the ability to perform data processing tasks within computer memory rather than transferring data back and forth between memory and storage devices. This results in faster and more efficient computing, as it reduces the amount of data that needs to be moved around the system. Computing in memory is particularly useful for tasks that involve large datasets, such as machine learning and AI applications. Neural networks, on the other hand, are a type of machine learning algorithm that is modeled after the structure of the human brain. They consist of interconnected nodes called neurons, which process and transmit inform...
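The link between the two technologies can be made concrete: the core operation of a neural-network layer is a matrix-vector multiply-accumulate, which is exactly the workload computing in memory accelerates, since the weight matrix can stay resident in the memory array while only activations move. A minimal illustrative sketch (plain Python, not tied to any chip in these posts):

```python
# Illustrative sketch: one dense neural-network layer reduces to a
# matrix-vector multiply-accumulate (MAC). In a CIM device the weight
# matrix is held in place inside the memory array, so only the input
# vector and the result are moved around the system.
def layer_forward(weights, inputs):
    """Each output neuron is the weighted sum of all inputs."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

W = [[1, 2],
     [3, 4]]              # weights: resident in the array (conceptually)
x = [10, 1]               # only the activation vector travels
y = layer_forward(W, x)   # -> [12, 34]
```

A large model performs billions of these MACs per inference, which is why avoiding the per-weight memory traffic matters so much for efficiency.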

Principles and Advantages of Computing in Memory

  The concept of Computing in Memory (CIM) can be traced back to the 1990s. When data must be fetched from memory outside the processing unit, the transfer time is often hundreds of times the computation time, and wasted energy accounts for roughly 60%-90% of the whole process, so energy efficiency is very low; this "storage wall" has become a major obstacle to data-intensive computing applications. Computing in Memory can be understood as embedding computing capability in the memory itself, performing two-dimensional and three-dimensional matrix multiply/add operations within a new computing architecture rather than optimizing traditional logic units or processes. This essentially eliminates the delay and power consumption of unnecessary data movement, can improve AI computing efficiency by hundreds of times, reduces cost, and breaks through the storage wall. By comparison, CPUs generally have 10-100 computing cores, while GPUs generally have tens of thousands of comput...
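The 60%-90% figure above can be sanity-checked with a back-of-envelope model. The per-operation energies below are purely illustrative placeholders, not measured figures from the article; the point is only that when each multiply-accumulate must fetch its weight from distant memory, movement dominates the budget, and keeping weights resident collapses that cost.

```python
# Back-of-envelope sketch with illustrative (not measured) energy
# numbers: what fraction of total energy goes to moving data rather
# than computing?
def movement_fraction(n_macs, e_mac_pj, n_fetches, e_fetch_pj):
    """Energy fraction spent on data movement instead of computation."""
    compute = n_macs * e_mac_pj
    movement = n_fetches * e_fetch_pj
    return movement / (compute + movement)

# Von Neumann style: every MAC fetches its weight from outside the
# processing unit. Assumed costs: 1 pJ per MAC, 5 pJ per fetch.
vn = movement_fraction(n_macs=1_000_000, e_mac_pj=1.0,
                       n_fetches=1_000_000, e_fetch_pj=5.0)

# CIM style: weights stay resident in the array, so weight traffic
# nearly disappears (only a token amount of I/O remains).
cim = movement_fraction(n_macs=1_000_000, e_mac_pj=1.0,
                        n_fetches=1_000, e_fetch_pj=5.0)
```

Under these assumed costs the von Neumann case spends about 83% of its energy on movement, squarely in the 60%-90% band cited above, while the CIM case spends well under 1%.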