🟦 Apple announces new MacBook Pro built for generative AI

With the M4, Apple makes the MacBook Pro handle AI workloads smoothly
Apple announces M4 Pro and M4 Max
Alongside the M4, Apple has announced two new chips, the M4 Pro and M4 Max, which bring even more power-efficient performance and advanced features to the Mac.

🟦 Apple announces new generative AI-powered MacBook Pro, claiming the industry's fastest performance with the M4 chip

Apple has unveiled the new MacBook Pro, featuring the latest M4 series of generative AI-enabled chips and delivering significant gains in performance and power efficiency for professional workloads. The high-performance M4 chips are also a move to strengthen the competitiveness of Apple's lineup ahead of the holiday shopping season.

🟦 Performance of the latest M4 chip series

Apple has announced the new “M4 Pro” and “M4 Max” chips that will power the MacBook Pro. The M4 series boasts more than three times the AI processing speed of the original M1 chip and is designed to run generative AI large language models (LLMs) smoothly. To meet the demands of professional engineers and creators, the chips also further improve power efficiency, add Thunderbolt 5 support, and expand unified memory bandwidth.
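The "more than three times" figure is consistent with Apple's published Neural Engine throughput numbers. As a rough sanity check, assuming the commonly cited ratings of 11 TOPS for the M1 (2020) and 38 TOPS for the M4 (2024):

```python
# Hedged sanity check of the "more than 3x AI processing speed" claim,
# using Apple's published Neural Engine throughput ratings (assumed values).
m1_tops = 11  # M1 Neural Engine, trillion ops/sec (Apple, 2020)
m4_tops = 38  # M4 Neural Engine, trillion ops/sec (Apple, 2024)

speedup = m4_tops / m1_tops
print(f"M4 vs. M1 Neural Engine: {speedup:.2f}x")  # ≈ 3.45x
```

Peak TOPS is only one axis of AI performance; real LLM throughput also depends heavily on memory bandwidth, which is why the expanded unified memory bandwidth matters for these chips.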

Apple M Series-based devices

  • November 2020: M1 (N5)
    • MacBook Pro, MacBook Air, Mac mini, iPad Pro, iPad Air
  • October 2021: M1 Pro, M1 Max (N5)
    • MacBook Pro, Mac Studio
  • March 2022: M1 Ultra (N5)
    • Mac Studio
  • June 2022: M2 (N5P)
    • MacBook Pro, MacBook Air, Mac mini, iPad Pro, iPad Air, Vision Pro
  • January 2023: M2 Pro, M2 Max (N5P)
    • MacBook Pro, Mac Studio
  • June 2023: M2 Ultra (N5P)
    • Mac Studio
  • November 2023: M3, M3 Pro, M3 Max (N3)
    • MacBook Pro, MacBook Air
  • May 2024: M4 (N3E)
    • iPad Pro, iMac, Mac mini
  • October 2024: M4 Pro, M4 Max (N3E)
    • MacBook Pro, Mac mini

🟦 Summary

With the announcement of the new MacBook Pro, featuring its latest in-house M4-series chips and enhanced generative AI capabilities, Apple is leading the industry in performance and power efficiency. The key point is its optimization for professional use, notably generative AI processing speed and support for large language models.
