Spark in me
Lost like tears in rain. DS, ML, a bit of philosophy and math. No bs or ads.
Alice - a rap on the news of the day
Oh, I remember how certain ML channels used to call for donating to Meduza as a kind of "proxy".

The law has no retroactive force, of course, but it seems the recursion has slowly started to close.
Forwarded from Синодов пишет (Yury Sinodov)
Keep in mind

2. Providing or collecting funds, or rendering financial services, knowingly intended to support the activities of a foreign or international non-governmental organization whose activities have been declared undesirable on the territory of the Russian Federation in accordance with Russian law, -

shall be punished by compulsory labour for a term of up to three hundred and sixty hours, or by forced labour for a term of up to four years with restriction of liberty for up to two years or without it, or by imprisonment for a term of one to five years with deprivation of the right to hold certain positions or engage in certain activities for up to ten years or without it.

Discuss?
AI Psychosis

- https://blog.piekniewski.info/2023/02/07/ai-psychosis/

Software engineers. GPT can write code. This is somewhat exciting, and in sweet-salty fast-food news fashion the media and the tech bros interpreted it as the end of the software engineering profession. So I asked it to write some code. In all but the most rudimentary functions the result was littered with bugs, and in more than half of the cases it wasn't even in the right vicinity. There might be some uses, e.g. GPT seems to be pretty decent at generating docstrings, but as with all the examples above, "replacing software engineers" is a pipe-dream hyperbole. It's worth recalling that when the four color theorem was proved in the late 70s with the help of a computer, people similarly feared that mathematicians were about to become irrelevant; some 50 years later we can easily see how that ended.

I too am so fed up with this shit.
Though merging 2 images for article art is a godsend.
If you have not lived in Russia, you may not know that in some cities bureaucrats, cashiers and bank clerks are unnecessary for 90% of mundane operations (which gives people more time to attend to the remaining 10%). We simply have the best online banking and online government services.

In some supermarkets there are even self-checkout machines ... and they just work, without sophisticated tracking, 10x cameras or tags. They run on the honour system and plain regular surveillance.

I was especially shocked to find them in a lame-ass supermarket in my home town. Of course there were people to monitor the process and help, but only 1-2 people per 10 self-checkout machines.
A man is judged by his friends

An interesting confirmation of the thesis that "a man is judged by his friends".

For several years in a row I submitted our public articles on speech synthesis to Habr's contests. And ... the best article, with a 200+ rating ... according to Habr's admins, simply vanished from the contest by accident! No, it was not a ban. It just accidentally disappeared. Trust us! I do not even need to exaggerate.

And that time, when it disappeared, the best article in that category was, of course, about Alice's whisper, right.

Now Habr has apparently come up with community experts for each category, but ... the first person on that list publicly called for donating to Meduza, and the second writes long opuses about the death of our science and why students in labs should not be paid properly.

So many, many wonderful coincidences!

But what am I saying. Habr is now a foreign media outlet with an editorial staff of a few dozen people and gently declining metrics. It would be strange if they rowed in the opposite direction.

The only question is how far they will go.
An interesting perspective here. What if LLMs are viewed through the lens of Microsoft trying to take some share of the search market?

Trends in the dollar training cost of machine learning systems - https://epochai.org/blog/trends-in-the-dollar-training-cost-of-machine-learning-systems
The Inference Cost Of Search Disruption – Large Language Model Cost Analysis - https://www.semianalysis.com/p/the-inference-cost-of-search-disruption
The AI Brick Wall – A Practical Limit For Scaling Dense Transformer Models, and How GPT 4 Will Break Past It - https://www.semianalysis.com/p/the-ai-brick-wall-a-practical-limit
Training Compute-Optimal Large Language Models - https://arxiv.org/pdf/2203.15556.pdf
😱 How Nvidia’s CUDA Monopoly In Machine Learning Is Breaking - OpenAI Triton And PyTorch 2.0 - https://www.semianalysis.com/p/nvidiaopenaitritonpytorch

TLDR

- Nvidia's dominant position in this field, mainly due to its software moat, is being disrupted;

- PyTorch won the hearts of researchers and small / large firms;

- Nvidia's FLOPS have increased by multiple orders of magnitude, partly by leveraging Moore's Law, but primarily through architectural changes such as the tensor core and lower-precision floating point formats. Memory, in contrast, has not followed the same path;

- The next step down in the memory hierarchy is tightly coupled off-chip memory, DRAM. DRAM followed the path of Moore’s Law for many decades. Since ~2012 though, the cost of DRAM has barely improved;

- Comparing Nvidia’s 2016 P100 GPU to their 2022 H100 GPU that is just starting to ship, there is a 5x increase in memory capacity (16GB -> 80GB) but a 46x increase in FP16 performance (21.2 TFLOPS -> 989.5 TFLOPS).

- From the current generation A100 to the next generation H100, the FLOPS grow by more than 6X, but memory bandwidth only grows by 1.65x;

- One of the principal optimization methods for a model executed in eager mode is operator fusion; this optimization often involves writing custom CUDA kernels;

- The growth in operators and position as the default has helped Nvidia as each operator was quickly optimized for their architecture but not for any other hardware. If an AI hardware startup wanted to fully implement PyTorch, that meant supporting the growing list of 2,000 operators natively with high performance;

- PyTorch 2.0 brings many changes, but the primary difference is that it adds a compiled solution that supports a graph execution model;

- OpenAI's Triton is a very disruptive angle on Nvidia's closed-source software moat for machine learning. Triton takes in Python directly or is fed through the PyTorch Inductor stack; the latter will be the most common use case. Triton then converts the input to an LLVM intermediate representation and generates code from it. In the case of Nvidia GPUs, it directly generates PTX code, skipping Nvidia's closed-source CUDA libraries, such as cuBLAS, in favor of open-source ones, such as CUTLASS. The Triton kernels themselves are quite legible to the typical ML researcher, which is huge for usability;
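The compute-versus-memory gap those bullets describe can be sanity-checked with the figures quoted in the post itself (a quick back-of-the-envelope in Python; only the numbers from the bullets are used):

```python
# Growth in FP16 compute vs memory capacity, P100 (2016) -> H100 (2022),
# using the figures quoted in the bullets above.
p100_tflops, h100_tflops = 21.2, 989.5  # FP16 TFLOPS
p100_mem_gb, h100_mem_gb = 16, 80       # on-package memory, GB

flops_growth = h100_tflops / p100_tflops  # ~46.7x
mem_growth = h100_mem_gb / p100_mem_gb    # 5.0x

# Memory per FLOP shrank roughly 9x, which is why feeding the compute
# units, rather than raw FLOPS, became the practical bottleneck.
gap = flops_growth / mem_growth
print(f"FLOPS x{flops_growth:.1f}, memory x{mem_growth:.1f}, gap x{gap:.1f}")
# -> FLOPS x46.7, memory x5.0, gap x9.3
```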
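And since operator fusion is the load-bearing concept in the eager-mode and Triton bullets, here is a minimal sketch of the idea in plain Python (the lists stand in for GPU memory and the functions for kernels; the names are illustrative, not any framework's API):

```python
# Unfused (eager) execution of y = relu(x * w + b): each operator is its
# own "kernel" that reads its input and writes an intermediate back,
# so the data makes three round trips through memory.
def unfused(xs, w, b):
    tmp1 = [x * w for x in xs]           # kernel 1: mul, writes len(xs) values
    tmp2 = [t + b for t in tmp1]         # kernel 2: add, reads and writes again
    return [max(t, 0.0) for t in tmp2]   # kernel 3: relu, a third pass

# Fused execution: one "kernel" does mul + add + relu per element, so the
# data is read once and written once; no intermediates hit memory.
def fused(xs, w, b):
    return [max(x * w + b, 0.0) for x in xs]

assert unfused([1.0, -2.0, 3.0], 2.0, 1.0) == fused([1.0, -2.0, 3.0], 2.0, 1.0)
```

Same arithmetic, a third of the memory traffic; on hardware where bandwidth grows 1.65x while FLOPS grow 6x, that traffic is exactly what fusion, and hence custom CUDA kernels or Triton, saves.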
Moderation tools in Telegram

I tried a dozen different ones. In the end, nothing comes even within a kilometer of @MissRose_bot.

There was a decent anti-profanity bot, but it died.

And the built-in features are slowly catching up; in principle they already cover 95% of the needs, especially with the latest release.
An ordinary Practicum, but now with the backing of МинЦифры?

The Ministry of Digital Development (МинЦифры) runs a program teaching schoolchildren to code. According to official information, around 130k pupils in grades 8-11 enrolled (or 200k enrolled and 130k started studying; it does not matter, a lot either way).

Yandex has Yandex Practicum. Yandex Practicum has employees. The employees have interesting achievements. One, two and three.

You have probably already guessed who is one of the service providers there?

What an interesting code for our future МинЦифры is getting!
Forwarded from Du Rove's Channel
🏆 In the last 5 years, Telegram surpassed Facebook Messenger to become the most popular cloud-based messaging app. Telegram is now second only to WhatsApp and is closing the gap year by year. No wonder our competitors are concerned! 🔥
⚛️ ❤️ Atomic Heart is awesome, but flawed

I very much wanted this game to be better than my favorite similar games - Doom Eternal (2020) and Prey (2017).

Sadly it is not, but for THE FIRST game from a new developer, it is just a marvel. A world-class AAA product, albeit with some game loop flaws:

- Art, setting and landscapes are flawless and awesome. Too many anachronisms though. Proper propaganda - pro science, pro peace, pro progress, etc.;
- The game itself is OK; it tries to be an immersive sim but fails to, and is mostly just an OK shooter. It is also very easy, even on hard;
- Very many questionable game loop design decisions;
- Properly optimized for weak hardware; bugs are present, but not very annoying;

No spoilers, see the game for yourself.

Also you can generate voice from the game in our bot @silero_voice_bot.

Hope that DLCs will fix these issues!
🍿 Wondering what the 4 promised DLCs will be

I have more or less 100% finished the game.
All my previous conclusions hold.

The game deserves praise (the first major title by this developer, and it is a world-class product). I will not spoil anything, but beware:

- Lots of great content, but it is unevenly spread;
- In the end ... questionable ideology overall;
- Lots of problems with game loop mechanics design;

Funnily enough, the game gets overall good reviews on Steam (~90%) despite mass efforts by Western game media to cancel it and mass attacks by Ukrainian cognitive farms.