{"id":54261,"date":"2025-11-29T08:19:51","date_gmt":"2025-11-29T08:19:51","guid":{"rendered":"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/"},"modified":"2025-11-29T08:19:51","modified_gmt":"2025-11-29T08:19:51","slug":"what-to-be-thankful-for-in-ai-in-2025","status":"publish","type":"post","link":"https:\/\/youzum.net\/fr\/what-to-be-thankful-for-in-ai-in-2025\/","title":{"rendered":"What to be thankful for in AI in 2025"},"content":{"rendered":"<p>Hello, dear readers. Happy belated Thanksgiving and Black Friday!<\/p>\n<p>This year has felt like living inside a permanent DevDay. Every week, some lab drops a new model, a new agent framework, or a new \u201cthis changes everything\u201d demo. It\u2019s overwhelming. But it\u2019s also the first year I\u2019ve felt like AI is finally diversifying \u2014 not just one or two frontier models in the cloud, but a whole ecosystem: open and closed, giant and tiny, Western and Chinese, cloud and local.<\/p>\n<p>So for this Thanksgiving edition, here\u2019s what I\u2019m genuinely thankful for in AI in 2025 \u2014 the releases that feel like they\u2019ll matter in 12\u201324 months, not just during this week\u2019s hype cycle.<\/p>\n<h3><b>1. OpenAI kept shipping strong: GPT-5, GPT-5.1, Atlas, Sora 2 and open weights<\/b><\/h3>\n<p>As the company that undeniably birthed the &#8220;generative AI&#8221; era with its viral hit product ChatGPT in late 2022, OpenAI arguably had among the hardest tasks of any AI company in 2025: continue its growth trajectory even as well-funded competitors like Google with its Gemini models and other startups like Anthropic fielded their own highly competitive offerings. <\/p>\n<p>Thankfully, OpenAI rose to the challenge and then some. 
Its headline act was GPT-5, unveiled in August as the next frontier reasoning model, followed in <a href=\"https:\/\/venturebeat.com\/ai\/openai-reboots-chatgpt-experience-with-gpt-5-1-after-mixed-reviews-of-gpt-5\">November by GPT-5.1<\/a> with new Instant and Thinking variants that dynamically adjust how much \u201cthinking time\u201d they spend per task. <\/p>\n<p>In practice, GPT-5\u2019s launch was bumpy \u2014 VentureBeat documented early math and coding failures and a cooler-than-expected community reaction in \u201c<a href=\"https:\/\/venturebeat.com\/ai\/openais-gpt-5-rollout-is-not-going-smoothly\">OpenAI\u2019s GPT-5 rollout is not going smoothly<\/a>,&#8221; but it quickly course-corrected based on user feedback and, as a daily user of the model, I&#8217;m personally pleased and impressed with it. <\/p>\n<p>At the same time, enterprises actually using the models are reporting solid gains. <a href=\"https:\/\/www.linkedin.com\/company\/zendesk-global\/\">ZenDesk Global<\/a>, for example, <a href=\"https:\/\/venturebeat.com\/ai\/zendesk-reports-30-faster-response-95-reliability-after-gpt-5-integration?utm_source=chatgpt.com\">says GPT-5-powered agents now resolve more than half of customer tickets<\/a>, with some customers seeing 80\u201390% resolution rates. That\u2019s the quiet story: these models may not always impress the chattering classes on X, but they\u2019re starting to move real KPIs.<\/p>\n<p>On the tooling side, OpenAI finally gave developers a serious AI engineer with GPT-5.1-Codex-Max, a new coding model that can run long, agentic workflows and is already the default in OpenAI\u2019s Codex environment. 
VentureBeat covered it in detail in \u201c<a href=\"https:\/\/venturebeat.com\/ai\/openai-debuts-gpt-5-1-codex-max-coding-model-and-it-already-completed-a-24\">OpenAI debuts GPT-5.1-Codex-Max coding model and it already completed a 24-hour task internally<\/a>.\u201d <\/p>\n<p>Then there\u2019s ChatGPT Atlas, <a href=\"https:\/\/venturebeat.com\/ai\/openai-releases-chatgpt-atlas-an-ai-enabled-web-browser-to-challenge-google\">a full browser with ChatGPT baked into the chrome itself<\/a> \u2014 sidebar summaries, on-page analysis, and search tightly integrated into regular browsing. It\u2019s the clearest sign yet that \u201cassistant\u201d and \u201cbrowser\u201d are on a collision course.<\/p>\n<p>On the media side, Sora 2 turned the original Sora video demo into a full video-and-audio model with better physics, synchronized sound and dialogue, and more control over style and shot structure, plus <a href=\"https:\/\/venturebeat.com\/ai\/openai-debuts-sora-2-ai-video-generator-app-with-sound-and-self-insertion\">a dedicated Sora app<\/a> with a full-fledged social networking component, allowing any user to <a href=\"https:\/\/www.linkedin.com\/pulse\/your-own-personalized-tv-network-pocket-reflections-sora-mwz3e\/?trackingId=F0O9AmNbiyBUvYozY3z8Dw%3D%3D\">create their own TV network in their pocket<\/a>. <\/p>\n<p>Finally \u2014 and maybe most symbolically \u2014 <a href=\"https:\/\/venturebeat.com\/ai\/openai-returns-to-open-source-roots-with-new-models-gpt-oss-120b-and-gpt-oss-20b\">OpenAI released gpt-oss-120B and gpt-oss-20B<\/a>, open-weight MoE reasoning models under an Apache 2.0\u2013style license. Whatever you think of their quality (and early open-source users have been loud about their complaints), this is the first time since GPT-2 that OpenAI has put serious weights into the public commons.<\/p>\n<h3><b>2. 
China\u2019s open-source wave goes mainstream<\/b><\/h3>\n<p>If 2023\u201324 was about Llama and Mistral, 2025 belongs to China\u2019s open-weight ecosystem.<\/p>\n<p>A study from MIT and Hugging Face found that <a href=\"https:\/\/www.ft.com\/content\/931c8218-a9d7-4cbd-8b08-27516637ff41?utm_source=chatgpt.com\">China now slightly leads the U.S. in global open-model downloads<\/a>, largely thanks to DeepSeek and Alibaba\u2019s Qwen family. <\/p>\n<p>Highlights:<\/p>\n<ul>\n<li>\n<p><b>DeepSeek-R1 <\/b><a href=\"https:\/\/venturebeat.com\/ai\/why-everyone-in-ai-is-freaking-out-about-deepseek\">dropped in January<\/a> as an open-source reasoning model rivaling OpenAI\u2019s o1, with MIT-licensed weights and a family of distilled smaller models. VentureBeat has followed the story from its release to its <a href=\"https:\/\/venturebeat.com\/security\/deepseek-helps-speed-up-threat-detection-while-raising-national-security-concerns\">cybersecurity impact<\/a> to <a href=\"https:\/\/venturebeat.com\/ai\/holy-smokes-a-new-200-faster-deepseek-r1-0528-variant-appears-from-german-lab-tng-technology-consulting-gmbh?utm_source=chatgpt.com\">performance-tuned R1 variants<\/a>.<\/p>\n<\/li>\n<li>\n<p><b>Kimi K2 Thinking <\/b>from Moonshot, a \u201cthinking\u201d open-source model that reasons step-by-step with tools, very much in the o1\/R1 mold, and is positioned as <a href=\"https:\/\/venturebeat.com\/ai\/moonshots-kimi-k2-thinking-emerges-as-leading-open-source-ai-outperforming\">the best open reasoning model in the world so far<\/a>.<\/p>\n<\/li>\n<li>\n<p><b>Z.ai<\/b> shipped <a href=\"https:\/\/venturebeat.com\/ai\/chinese-startup-z-ai-launches-powerful-open-source-glm-4-5-model-family-with-powerpoint-creation\">GLM-4.5 and GLM-4.5-Air<\/a> as \u201cagentic\u201d models, open-sourcing base and hybrid reasoning variants on GitHub.<\/p>\n<\/li>\n<li>\n<p>Baidu\u2019s <b>ERNIE 4.5 <\/b>family arrived as a fully open-sourced, multimodal MoE suite under Apache 2.0, including a 
0.3B dense model and visual \u201c<a href=\"https:\/\/venturebeat.com\/ai\/baidu-just-dropped-an-open-source-multimodal-ai-that-it-claims-beats-gpt-5\">Thinking<\/a>\u201d variants focused on charts, STEM, and tool use.<\/p>\n<\/li>\n<li>\n<p>Alibaba\u2019s <b>Qwen3 <\/b>line \u2014 including Qwen3-Coder, large reasoning models, and the Qwen3-VL series released over the summer and fall of 2025 \u2014 continues to set a high bar for open weights in coding, translation, and multimodal reasoning, leading me to declare this past summer as &#8220;<a href=\"https:\/\/venturebeat.com\/ai\/its-qwens-summer-new-open-source-qwen3-235b-a22b-thinking-2507-tops-openai-gemini-reasoning-models-on-key-benchmarks\">Qwen&#8217;s summer<\/a>.&#8221;<\/p>\n<\/li>\n<\/ul>\n<p>VentureBeat has been tracking these shifts, including Chinese math and reasoning models like <a href=\"https:\/\/venturebeat.com\/ai\/new-open-source-math-model-light-r1-32b-surpasses-equivalent-deepseek-performance-with-only-1000-in-training-costs?utm_source=chatgpt.com\">Light-R1-32B<\/a> and Weibo\u2019s tiny <a href=\"https:\/\/venturebeat.com\/ai\/weibos-new-open-source-ai-model-vibethinker-1-5b-outperforms-deepseek-r1-on\">VibeThinker-1.5B<\/a>, which beat DeepSeek baselines on shoestring training budgets.<\/p>\n<p>If you care about open ecosystems or on-premise options, this is the year China\u2019s open-weight scene stopped being a curiosity and became a serious alternative.<\/p>\n<h3><b>3. 
Small and local models grow up<\/b><\/h3>\n<p>Another thing I\u2019m thankful for: we\u2019re finally getting <i>good<\/i> small models, not just toys.<\/p>\n<p>Liquid AI spent 2025 pushing its Liquid Foundation Models (LFM2) and <a href=\"https:\/\/venturebeat.com\/ai\/liquid-ai-wants-to-give-smartphones-small-fast-ai-that-can-see-with-new-lfm2-vl-model\">LFM2-VL vision-language variants<\/a>, designed from day one for low-latency, device-aware deployments \u2014 edge boxes, robots, and constrained servers, not just giant clusters. The newer <a href=\"https:\/\/www.liquid.ai\/blog\/lfm2-vl-3b-a-new-efficient-vision-language-for-the-edge\">LFM2-VL-3B<\/a> targets embedded robotics and industrial autonomy, with demos planned at ROSCon. <\/p>\n<p>On the big-tech side, <a href=\"https:\/\/venturebeat.com\/ai\/google-unveils-open-source-gemma-3-model-with-128k-context-window\">Google\u2019s Gemma 3 line<\/a> made a strong case that \u201ctiny\u201d can still be capable. Gemma 3 spans from 270M parameters up through 27B, all with open weights and multimodal support in the larger variants. <\/p>\n<p>The standout is Gemma 3 270M, a compact model purpose-built for fine-tuning and structured text tasks \u2014 think custom formatters, routers, and watchdogs \u2014 covered both in Google\u2019s developer blog and community discussions in local-LLM circles. <\/p>\n<p>These models may never trend on X, but they\u2019re exactly what you need for privacy-sensitive workloads, offline workflows, thin-client devices, and \u201cagent swarms\u201d where you don\u2019t want every tool call hitting a giant frontier LLM.<\/p>\n<h3><b>4. 
Meta + Midjourney: aesthetics as a service<\/b><\/h3>\n<p>One of the stranger twists this year: Meta partnered with Midjourney instead of simply trying to beat it.<\/p>\n<p>In August, Meta announced a deal to license Midjourney\u2019s \u201caesthetic technology\u201d \u2014 its image and video generation stack \u2014 and integrate it into Meta\u2019s future models and products, from Facebook and Instagram feeds to Meta AI features.<\/p>\n<p>VentureBeat covered the partnership in \u201c<a href=\"https:\/\/venturebeat.com\/ai\/meta-is-partnering-with-midjourney-and-will-license-its-technology-for-future-models-and-products\">Meta is partnering with Midjourney and will license its technology for future models and products<\/a>,\u201d raising the obvious question: does this slow or reshape Midjourney\u2019s own API roadmap? There\u2019s no definitive answer yet, but Midjourney\u2019s stated plans for an API release have unfortunately yet to materialize, suggesting the deal has indeed slowed that roadmap. <\/p>\n<p>For creators and brands, though, the immediate implication is simple: Midjourney-grade visuals start to show up in mainstream social tools instead of being locked away in a Discord bot. That could normalize higher-quality AI art for a much wider audience \u2014 and force rivals like OpenAI, Google, and Black Forest Labs to keep raising the bar.<\/p>\n<h3><b>5. Google\u2019s Gemini 3 and Nano Banana Pro<\/b><\/h3>\n<p>Google tried to answer GPT-5 with Gemini 3, billed as its most capable model yet, with better reasoning, coding, and multimodal understanding, plus a new Deep Think mode for slow, hard problems. <\/p>\n<p>VentureBeat\u2019s coverage, \u201c<a href=\"https:\/\/venturebeat.com\/ai\/google-unveils-gemini-3-claiming-the-lead-in-math-science-multimodal-and\">Google unveils Gemini 3 claiming the lead in math, science, multimodal and agentic AI<\/a>,\u201d framed it as a direct shot at frontier benchmarks and agentic workflows. 
<\/p>\n<p>But the surprise hit is <a href=\"https:\/\/venturebeat.com\/ai\/googles-upgraded-nano-banana-pro-ai-image-model-hailed-as-absolutely-bonkers\">Nano Banana Pro (Gemini 3 Pro Image), Google\u2019s new flagship image generator<\/a>. It specializes in infographics, diagrams, multi-subject scenes, and multilingual text that actually renders legibly across 2K and 4K resolutions. <\/p>\n<p>In the world of enterprise AI \u2014 where charts, product schematics, and \u201cexplain this system visually\u201d images matter more than fantasy dragons \u2014 that\u2019s a big deal.<\/p>\n<h3><b>6. Wild cards I\u2019m keeping an eye on<\/b><\/h3>\n<p>A few more releases I\u2019m thankful for, even if they don\u2019t fit neatly into one bucket:<\/p>\n<ul>\n<li>\n<p><b>Black Forest Labs\u2019 Flux.2<\/b> image models, which launched just earlier this week with ambitions to challenge both Nano Banana Pro and Midjourney on quality and control. VentureBeat dug into the details in \u201c<a href=\"https:\/\/venturebeat.com\/ai\/black-forest-labs-launches-flux-2-ai-image-models-to-challenge-nano-banana\">Black Forest Labs launches Flux.2 AI image models to challenge Nano Banana Pro and Midjourney<\/a>.&#8221;<\/p>\n<\/li>\n<li>\n<p><b>Anthropic\u2019s Claude Opus 4.5<\/b>, a new flagship that aims for cheaper, more capable coding and long-horizon task execution, covered in \u201c<a href=\"https:\/\/venturebeat.com\/ai\/anthropics-claude-opus-4-5-is-here-cheaper-ai-infinite-chats-and-coding-skills-that-beat-humans\">Anthropic\u2019s Claude Opus 4.5 is here: Cheaper AI, infinite chats, and coding skills that beat humans<\/a>.&#8221; <\/p>\n<\/li>\n<li>\n<p>A steady drumbeat of open math\/reasoning models \u2014 from Light-R1 to VibeThinker and others \u2014 that show you don\u2019t need $100M training runs to move the needle.<\/p>\n<\/li>\n<\/ul>\n<h3><b>Last thought (for now)<\/b><\/h3>\n<p>If 2024 was the year of \u201cone big model in the cloud,\u201d 2025 is the year the map 
exploded: multiple frontiers at the top, China taking the lead in open models, small and efficient systems maturing fast, and creative ecosystems like Midjourney getting pulled into big-tech stacks.<\/p>\n<p>I\u2019m thankful not just for any single model, but for the fact that we now have <i>options<\/i> \u2014 closed and open, local and hosted, reasoning-first and media-first. For journalists, builders, and enterprises, that diversity is the real story of 2025.<\/p>\n<p>Happy holidays and best to you and your loved ones!<\/p>","protected":false},"excerpt":{"rendered":"<p>Hello, dear readers. Happy belated Thanksgiving and Black Friday! This year has felt like living inside a permanent DevDay. Every week, some lab drops a new model, a new agent framework, or a new \u201cthis changes everything\u201d demo. It\u2019s overwhelming. But it\u2019s also the first year I\u2019ve felt like AI is finally diversifying \u2014 not just one or two frontier models in the cloud, but a whole ecosystem: open and closed, giant and tiny, Western and Chinese, cloud and local. So for this Thanksgiving edition, here\u2019s what I\u2019m genuinely thankful for in AI in 2025 \u2014 the releases that feel like they\u2019ll matter in 12\u201324 months, not just during this week\u2019s hype cycle.<\/p>","protected":false},"author":2,"featured_media":54262,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"pmpro_default_level":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"_pvb_checkbox_block_on_post":false,"footnotes":""},"categories":[52,5,7,1],"tags":[],"class_list":["post-54261","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-club","category-committee","category-news","category-uncategorized","pmpro-has-access"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.3 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>What to be thankful for in AI in 2025 - YouZum<\/title>\n<meta name=\"description\" content=\"\u0e01\u0e34\u0e08\u0e01\u0e23\u0e23\u0e21\u0e40\u0e01\u0e35\u0e48\u0e22\u0e27\u0e01\u0e31\u0e1a\u0e42\u0e14\u0e23\u0e19\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link 
rel=\"canonical\" href=\"https:\/\/youzum.net\/fr\/what-to-be-thankful-for-in-ai-in-2025\/\" \/>\n<meta property=\"og:locale\" content=\"fr_FR\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"What to be thankful for in AI in 2025 - YouZum\" \/>\n<meta property=\"og:description\" content=\"\u0e01\u0e34\u0e08\u0e01\u0e23\u0e23\u0e21\u0e40\u0e01\u0e35\u0e48\u0e22\u0e27\u0e01\u0e31\u0e1a\u0e42\u0e14\u0e23\u0e19\" \/>\n<meta property=\"og:url\" content=\"https:\/\/youzum.net\/fr\/what-to-be-thankful-for-in-ai-in-2025\/\" \/>\n<meta property=\"og:site_name\" content=\"YouZum\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/DroneAssociationTH\/\" \/>\n<meta property=\"article:published_time\" content=\"2025-11-29T08:19:51+00:00\" \/>\n<meta name=\"author\" content=\"admin NU\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"\u00c9crit par\" \/>\n\t<meta name=\"twitter:data1\" content=\"admin NU\" \/>\n\t<meta name=\"twitter:label2\" content=\"Dur\u00e9e de lecture estim\u00e9e\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/\"},\"author\":{\"name\":\"admin NU\",\"@id\":\"https:\/\/yousum.gpucore.co\/#\/schema\/person\/97fa48242daf3908e4d9a5f26f4a059c\"},\"headline\":\"What to be thankful for in AI in 
2025\",\"datePublished\":\"2025-11-29T08:19:51+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/\"},\"wordCount\":1540,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/yousum.gpucore.co\/#organization\"},\"image\":{\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/youzum.net\/wp-content\/uploads\/2025\/11\/ChatGPT_Image_Nov_28__2025__10_42_26_AM-P3sBgt.png\",\"articleSection\":[\"AI\",\"Committee\",\"News\",\"Uncategorized\"],\"inLanguage\":\"fr-FR\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/\",\"url\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/\",\"name\":\"What to be thankful for in AI in 2025 - YouZum\",\"isPartOf\":{\"@id\":\"https:\/\/yousum.gpucore.co\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/youzum.net\/wp-content\/uploads\/2025\/11\/ChatGPT_Image_Nov_28__2025__10_42_26_AM-P3sBgt.png\",\"datePublished\":\"2025-11-29T08:19:51+00:00\",\"description\":\"\u0e01\u0e34\u0e08\u0e01\u0e23\u0e23\u0e21\u0e40\u0e01\u0e35\u0e48\u0e22\u0e27\u0e01\u0e31\u0e1a\u0e42\u0e14\u0e23\u0e19\",\"breadcrumb\":{\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#breadcrumb\"},\"inLanguage\":\"fr-FR\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-FR\",\"@id\":\"https:\/\/youzum.net\/what-to-be-thankful-for-in-ai-in-2025\/#primaryimage\",\"url\":\"https:\/\/youzum.net\/wp-content\/