⚛️ ❤️ Atomic Heart is awesome, but flawed
I very much wanted this game to be better than my favorite similar games - Doom Eternal (2020) and Prey (2017).
Sadly it is not, but for THE FIRST game from a new developer it is simply a marvel: a world-class AAA product, albeit with some game loop flaws:
- Art, setting, and landscapes are flawless and awesome. Too many anachronisms though. Proper propaganda - pro science, pro peace, pro progress, etc.;
- The game itself is OK; it tries to be an immersive sim but falls short. Mostly it is just an OK shooter. It is also very easy, even on hard;
- Many questionable game loop design decisions;
- Properly optimized for weak hardware; bugs are present but not very annoying;
No spoilers, see the game for yourself.
You can also generate voices from the game in our bot @silero_voice_bot.
Hope that DLCs will fix these issues!
🍿 Wondering what the 4 promised DLCs will be
I have more or less 100%-finished the game.
All my previous conclusions hold.
The game deserves praise (a world-class product for this developer's first major title). I will not spoil anything, but beware:
- Lots of great content, but it is unevenly spread;
- In the end ... questionable ideology overall;
- Lots of problems with game loop mechanics design;
Funnily enough, the game gets overall good reviews on Steam (~90%) despite the mass efforts by western game media to cancel it and mass attacks by Ukrainian cognitive farms.
The New Gatekeepers
🗒 https://www.ben-evans.com/presentations
TLDR:
💸 The end of free money (for Americans)
- E-commerce is back to its long-term trend (after COVID), as are COVID-boosted products
- First time in history, ad revenue slips for Google and Facebook
- VC financing back to trend line
- 250k tech layoffs, last 12m
- 5bn people with a smartphone
📉 Decline of old gatekeepers
- Major decline of department stores and newspaper ads (5-10x over the last 20 years)
- Amazon surpassed Walmart
- Ads as a % of GDP - stable, but now 50%+ is Internet
- Top 10 media owners dominated by tech companies; Comcast and Disney are "small"
- Top 100 global advertisers ~ GAFA ad revenue
- The more imperialist the country, the higher its ad spend as a % of GDP; the US has 2x the retail space per capita of other imperialist countries
- Amazon ads > Amazon Prime, Amazon ads > global newspaper ads
- Amazon - 7.5% of retail revenue is ads
- Fast fashion overtakes traditional retail
- Streaming titles overtake cable and broadcast, fragmentation grows, tv unbundles
- Streaming entrants match legacy players' production budgets; YouTube creator payouts ~ 1 major content player
- Everything is about bundling and unbundling
🤷♂️ Speculations about the future
- Implosion of the next big thing (DeFi, NFT, several AI winters)
- US enterprise moving to cloud
- Big tech players are all looking to monopolize the next paradigm, so far without results
- Meta spent US$14bn on the metaverse in the last 12 months (25% of US spending on war)
- Generative "AI" circle jerk
Forwarded from RIA Novosti
The Big Circle Line, launched by Putin and Sobyanin, is the largest metro construction project in the world: it comprises 31 stations over a length of 70 kilometers.
On many routes, the BCL will save up to 35-45 minutes of travel time, and transport service will improve in 34 districts of Moscow.
Atomic Heart OST
Mick Gordon is nowhere to be seen in the credits.
I wonder why ...
https://youtu.be/R1eyjhTmErw
Judging by Telegram, in the first days 50-75% of the IT / ML community was actively pushing donations to Meduza and other organizations with an asterisk.
Now the situation is changing - smile and wave, a hidden camera is watching you.
Forwarded from The World Today with "Yuri Podolyaka" (Yuri Podolyaka)
A note for all citizens of Russia who "donate" money to the AFU (including those who fled)
"The FSB of Russia, in cooperation with the Ministry of Internal Affairs, has detained a Moscow resident for high treason in the form of financial assistance to the Armed Forces of Ukraine."
Under Article 275 of the Criminal Code of the Russian Federation, this crime carries a penalty of 12 to 20 years of imprisonment with a fine of up to five hundred thousand rubles or the convict's income for a period of up to three years.
Incidentally, this also applies to those Russians who left the country and now talk about how they are helping the AFU.
So (specifically for them) this case shows that YOU WILL NEVER RETURN TO THIS RUSSIA, EXCEPT THROUGH PRISON!!!
"ФСБ России во взаимодействии с МВД РФ задержала жительницу Москвы за госизмену в виде финансовой помощи Вооруженным силам Украины".
Согласно статьи 275 УК РФ за это преступление предусмотрено наказание на срок от 12 до 20 лет лишения свободы со штрафом до пятисот тысяч рублей или дохода осужденного за период до трех лет.
Кстати, это касается в т.ч. и тех россиян, которые уехали из страны и сегодня рассказывают о том, что они помогают ВСУ.
Так вот (специально для них) этот случай показывает, что В ЭТУ РОССИЮ ВЫ БОЛЬШЕ НИКОГДА НЕ ВЕРНЕТЕСЬ, РАЗВЕ ЧТО ЧЕРЕЗ ТЮРЬМУ!!!
Forwarded from Data Science by ODS.ai 🦜
LLaMA: Open and Efficient Foundation Language Models
LLaMA is a set of large language models, ranging from 7B to 65B parameters, that have been trained on publicly available datasets containing trillions of tokens. The LLaMA-13B model performs better than GPT-3 (175B) on most benchmarks, and the LLaMA-65B model is competitive with other state-of-the-art models, such as Chinchilla-70B and PaLM-540B. This suggests that it is possible to achieve excellent performance in language modeling without relying on proprietary or inaccessible datasets.
Paper: https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/
Code: https://github.com/facebookresearch/llama
A detailed unofficial overview of the paper: https://andlukyane.com/blog/paper-review-llama
#deeplearning #nlp #transformer #sota #languagemodel
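Since these are standard decoder-only transformers, running one locally is routine once the weights are converted to a common format. Below is a minimal inference sketch, assuming the Hugging Face transformers library and a hypothetical local directory with a converted 7B checkpoint (the path is illustrative, not an official hub ID):

```python
# Minimal inference sketch for a LLaMA-style causal LM.
# "path/to/llama-7b-hf" is a hypothetical local directory holding weights
# already converted to the Hugging Face format.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "path/to/llama-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

prompt = "Large language models trained on public data can"
inputs = tokenizer(prompt, return_tensors="pt")
# Greedy decoding with default settings; tune max_new_tokens as needed.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```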
Forwarded from gonzo-обзоры ML статей
Hot news: https://ai.facebook.com/blog/large-language-model-llama-meta-ai/
Training smaller foundation models like LLaMA is desirable in the large language model space because it requires far less computing power and resources to test new approaches, validate others’ work, and explore new use cases. Foundation models train on a large set of unlabeled data, which makes them ideal for fine-tuning for a variety of tasks. We are making LLaMA available at several sizes (7B, 13B, 33B, and 65B parameters) and also sharing a LLAMA model card that details how we built the model in keeping with our approach to Responsible AI practices.
In particular, LLaMA-13B outperforms GPT-3 (175B) on most benchmarks, and LLaMA-65B is competitive with the best models, Chinchilla-70B and PaLM-540B. We release all our models to the research community.
Model card: https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md
Paper: https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/
Form to apply: https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform
Unfortunately, it's only for non-commercial purposes :(
"You will not, and will not permit, assist or cause any third party to:
a. use, modify, copy, reproduce, create derivative works of, or distribute the Software Products (or any derivative works thereof, works incorporating the Software Products, or any data produced by the Software), in whole or in part, for (i) any commercial or production purposes ... "
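Since the announcement stresses that these foundation models are "ideal for fine-tuning for a variety of tasks", here is a minimal causal-LM fine-tuning sketch using the transformers Trainer. Everything specific in it is an assumption for illustration: the checkpoint path is hypothetical, wikitext is just a stand-in corpus, and the hyperparameters are not tuned:

```python
# Minimal causal-LM fine-tuning sketch (next-token objective, no labels).
# Checkpoint path is hypothetical; wikitext is only a stand-in corpus.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

checkpoint = "path/to/llama-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
if tokenizer.pad_token is None:  # LLaMA's tokenizer ships without a pad token
    tokenizer.pad_token = tokenizer.eos_token

# Any plain-text corpus works; drop empty rows before tokenizing.
raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
raw = raw.filter(lambda x: len(x["text"].strip()) > 0)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama-ft",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=tokenized,
    # mlm=False gives causal-LM collation: labels are the shifted inputs.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Note that the license quoted above rules out commercial use of anything trained this way.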
Predictably, it is now only for American academia. But the weights have leaked, of course.
Spark in me
Ordinary Praktikum, but now with the support of the Ministry of Digital Development? The Ministry has a program teaching schoolchildren to program. According to official information, about 130k pupils in grades 8-11 signed up (or 200k signed up and 130k started studying, not the point…
Yandex keeps going after children
How do you like this: Yandex is organizing an IT conference for schoolchildren in grades 5-11 about what cool products they make.
The approach came through Konyaev's agency. They wanted to advertise in our bot, which has a large audience of children.
A reminder that Yandex is currently being divided up, and the wonderful people from the post above will probably stay in Israel.
Comrade Major, why are foreign companies with theses like these (see the post above) trying to teach our children, while you do nothing?