The Japanese semiconductor industry is making a comeback as demand for chips rises with the spread of generative artificial intelligence and as geopolitical risks grow. Rapidus, a Japanese chipmaker, has begun installing extreme ultraviolet (EUV) lithography equipment at its new plant in Chitose, Hokkaido, with plans to start production this year.
https://www.japantimes.co.jp/business/2025/01/03/tech/japan-chipmakers-stage-comeback/

SK hynix, a leading semiconductor supplier based in South Korea, is developing new products to meet the growing demand for artificial intelligence (AI) technology. The company's CMM-Ax product adds computational functionality to high-capacity memory, improving the performance and energy efficiency of next-generation server platforms. SK hynix also plans to produce its sixth-generation HBM (HBM4) in the second half of this year to lead the customized HBM market, a move expected to further accelerate AI-driven change.
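
As a rough illustration of why adding compute next to high-capacity memory can improve performance and energy efficiency, here is a back-of-envelope Python sketch; the bandwidth and data-size figures are assumptions for illustration, not CMM-Ax specifications.

```python
# Back-of-envelope sketch of why computing inside a memory module can pay off.
# All numbers below are illustrative assumptions, not SK hynix CMM-Ax specs.

GiB = 2 ** 30

def host_side_reduce_s(data_bytes, link_bw=64 * GiB):
    """Every raw byte crosses the host link before the CPU reduces it."""
    return data_bytes / link_bw

def near_memory_reduce_s(data_bytes, result_bytes,
                         internal_bw=512 * GiB, link_bw=64 * GiB):
    """The reduction runs next to the DRAM; only the small result crosses the link."""
    return data_bytes / internal_bw + result_bytes / link_bw

data, result = 256 * GiB, 1 * GiB   # hypothetical working set and reduced output

print(f"host-side reduction : {host_side_reduce_s(data):.2f} s")
print(f"near-memory reduction: {near_memory_reduce_s(data, result):.2f} s")
```

The point of the toy model is simply that when the reduction happens inside the module, only the small result has to cross the host link, which is where much of the time and energy would otherwise go.
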
https://www.prnewswire.com/news-releases/sk-hynix-to-unveil-full-stack-ai-memory-provider-vision-at-ces-2025-302341613.html

Nvidia has upgraded its Jetson Orin Nano Developer Kit into the Jetson Orin Nano Super, increasing its memory bandwidth to 102GB/s while retaining the original hardware. The upgrade boosts raw compute performance and generative AI capability by up to 70%, enabling developers to deploy more sophisticated AI models on compact devices. The platform, now priced at $249, supports up to four high-resolution cameras and concurrent AI applications, making it well suited to prototyping and edge AI deployments. The device is powered by an Ampere-architecture GPU and a 6-core Arm CPU, and the free upgrade has raised questions about whether the hardware was capable of this performance from its original launch, with the extra performance simply waiting to be unlocked in software.
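
To put the 102GB/s figure in context, a common back-of-envelope estimate bounds LLM decode speed by how fast the model weights can be streamed from memory; the model size and efficiency factor below are assumptions for illustration, not Nvidia figures.

```python
# Memory-bandwidth-bound estimate of LLM decode throughput on an edge device.
# Only the 102 GB/s bandwidth comes from the announcement; the rest is assumed.

def tokens_per_second(bandwidth_gb_s, model_weight_gb, efficiency=0.6):
    # Each generated token must stream roughly all of the weights through the GPU,
    # so sustained bandwidth divided by model size bounds decode speed.
    return efficiency * bandwidth_gb_s / model_weight_gb

bandwidth = 102.0        # GB/s, Jetson Orin Nano Super memory bandwidth
weights_gb = 4.5         # assumed ~8B-parameter model quantized to 4 bits

print(f"~{tokens_per_second(bandwidth, weights_gb):.0f} tokens/s (rough upper bound)")
```
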
https://www.techradar.com/pro/good-news-popular-nvidia-hardware-gets-free-upgrade-that-boosts-its-performance-by-up-to-70-percent-but-its-not-for-gamers

Google has announced that its sixth-generation Tensor Processing Unit (TPU), called Trillium, is now generally available for rent, just months after its preview release. The chip offers up to a 2.5x improvement in training performance per dollar over previous TPU generations and was used to train Gemini 2.0, Google's advanced AI model. Trillium delivers more than four times the training performance of its predecessor, improves energy efficiency by 67%, and increases peak compute performance per chip by a factor of 4.7. It is also optimized for embedding-intensive models and forms the foundation of Google Cloud's AI Hypercomputer, which connects over 100,000 Trillium chips via a Jupiter network fabric delivering 13 Petabits/sec of bandwidth.
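
For a sense of how the headline numbers relate, the short arithmetic below combines the published 4x performance and 2.5x performance-per-dollar figures, and the fabric and chip counts; the derived values are rough inferences for illustration, not numbers quoted by Google.

```python
# Illustrative arithmetic on the published Trillium figures; derived values are
# rough inferences, not Google-quoted numbers.

perf_ratio = 4.0             # >4x training performance vs. the previous TPU generation
perf_per_dollar_ratio = 2.5  # up to 2.5x training performance per dollar

# perf_per_dollar = perf / price, so the implied price ratio is:
implied_price_ratio = perf_ratio / perf_per_dollar_ratio     # ~1.6x

chips = 100_000
fabric_bits_per_s = 13e15                                    # 13 Petabits/sec Jupiter fabric
per_chip_share_GBps = fabric_bits_per_s / chips / 8 / 1e9    # even split across chips

print(f"implied price ratio vs. prior generation: ~{implied_price_ratio:.1f}x")
print(f"fabric bandwidth per chip (even split):   ~{per_chip_share_GBps:.1f} GB/s")
```
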
https://www.techradar.com/pro/you-can-now-rent-googles-most-powerful-ai-chip-trillium-tpu-underpins-gemini-2-0-and-will-put-amd-and-nvidia-on-high-alert

MediaTek has announced the Dimensity 8400, a premium smartphone chip with advanced generative artificial intelligence (Gen-AI) capabilities. The chip builds on the company's flagship Dimensity 9400 and features an All Big Core design paired with a powerful Neural Processing Unit (NPU) to accelerate Gen-AI tasks, marking the first time this architecture has been brought to the premium smartphone segment.
https://www.manilatimes.net/2024/12/28/tmt-newswire/mediatek-unveils-dimensity-8400/2027817

Micron Technology, one of the world's leading memory makers, has partnered with chip designer Marvell to develop highly customized hardware for cloud operators. The collaboration reflects the industry's shift toward tailored solutions for AI applications and aims to increase memory capacity and bandwidth so that infrastructure can scale efficiently in the AI era. According to Raj Narasimhan, senior vice president at Micron, this customization will give hyperscalers a robust platform to deliver the performance required to scale AI capabilities.
https://www.techradar.com/pro/usd100bn-tech-company-youve-probably-never-heard-of-is-teaming-up-with-the-worlds-biggest-memory-manufacturers-to-produce-supercharged-hbm

Researchers at the University of Kansas and the University of Houston, led by Professor Judy Wu, have developed atomically tunable "memristors" that mimic the neural networks of the human brain. Funded through the National Science Foundation's Future of Semiconductors program with a $1.8 million grant, the work aims to create devices for high-speed, energy-efficient processing in artificial intelligence systems. Memristors can store and process information in the same place, enabling parallel data processing in the way biological brains do. The team has achieved memory devices less than 2 nanometers thick, roughly a tenth the thickness of typical devices, and is also investing in workforce development to address the growing need for skilled professionals in the semiconductor industry.
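
The "store and process in the same place" idea is usually pictured as a crossbar array, where the stored conductances double as the weights of an analog matrix-vector multiply; below is a minimal NumPy sketch of that conceptual model, with made-up device values.

```python
import numpy as np

# Conceptual model of in-memory computing with a memristor crossbar (illustrative only).
# Each memristor stores a conductance G[i, j]; applying row voltages V performs an
# analog matrix-vector multiply in one step, because the column currents are
# I[j] = sum_i V[i] * G[i, j] (Ohm's law plus Kirchhoff's current law).

rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # stored conductances (siemens) act as the "weights"
V = np.array([0.2, 0.0, 0.1, 0.3])        # input voltages applied to the rows (volts)

I = V @ G  # column currents: storage and computation happen in the same array

print("column currents (amps):", I)
```
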
https://www.techradar.com/pro/from-lab-to-life-atomic-scale-memristors-pave-the-way-for-brain-like-ai-and-next-gen-computing-power