Aleksa Gordić LinkedIn Posts
@aleksagordic
Happy to welcome 16 new H100s to the family!! 😅 A huge thank you to Hyperstack folks for sponsoring my compute! With 32
@aleksagordic
One of the most important skills nowadays may very well be the ability to quickly search for information - and I don't just
@aleksagordic
One of the super important habits I developed over the years is taking notes and logging what I do.
Every time I watch a video,
@aleksagordic
If you truly want to become proficient with machine learning (I really don't like the word expert) try to get out from the
@aleksagordic
[🚀 3D parallelism 🚀] In this last video of the Large Language Model series I walk you through BigScience's BLOOM 175 billion
@aleksagordic
[API 🚀] I'm really happy to announce Runa AI's API dev platform! - you can now start building with YugoGPT API: https://dev.runaai.com/
Currently we
@aleksagordic
[exciting paper 🧠] How to Train Data-Efficient LLMs - they show that training on 60% of the original dataset, their method "ASK-LLM" is
@aleksagordic
OpenAI's technical report on Sora is out! Honestly I've never been this excited about the future of AI. And one thing is sure:
@aleksagordic
This is one of those days, isn't it? 😂 DeepMind posts Gemini and then OpenAI releases a mind-blowing text-to-video model Sora!!! It
@aleksagordic
Google DeepMind just announced Gemini 1.5 Pro. You can't try it yet (sad face) but here is a brief technical summary as the results
@aleksagordic
My talk with Jeremy Howard is now live!
YouTube: https://lnkd.in/dDP8r592
We talked about http://answer.ai - Jeremy's new R&D lab (they just raised $10M), why the incentives in the industry (OpenAI
@aleksagordic
Had a lot of people ask for it so here it is: a comparison between YugoGPT (7B) and Aya-101 (13B) - a recent
@aleksagordic
[🐘🧠 new paper] Suppressing Pink Elephants with Direct Principle Feedback - do you know about this hack? (image) It actually works with humans
@aleksagordic
[New transformer alternatives!!! 🧠] Transformers have become the go-to architecture for LLMs. And more broadly - the whole ML field has converged on
@aleksagordic
[new paper 🧠] More Agents Is All You Need - they show a simple method to increase the accuracy of any given LLM:
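The post above is truncated, but to the best of my understanding the paper's core idea is sampling-and-voting: query the same LLM several times and take the majority answer. A minimal sketch, where `query_llm` is a hypothetical placeholder for whatever LLM API call you use:

```python
from collections import Counter

def query_llm(prompt: str) -> str:
    # Hypothetical stand-in: replace with a real call to your LLM of choice.
    raise NotImplementedError

def majority_vote(prompt: str, n_samples: int = 5, llm=query_llm) -> str:
    # Sample the model several times on the same prompt...
    answers = [llm(prompt) for _ in range(n_samples)]
    # ...and return the most frequent answer.
    return Counter(answers).most_common(1)[0][0]
```

The appeal of the method is that it is model-agnostic: it needs no fine-tuning, only repeated sampling at inference time.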
@aleksagordic
You can now chat with the Kurzgesagt – In a Nutshell YouTube channel! 🥳 https://lnkd.in/dPNh_XgZ
Kurzgesagt is probably the best channel on YouTube when it comes to
@aleksagordic
If you're still struggling to understand how transformers work, here are some amazing resources! (including mine! :))
First of all 3B1B just released 2
@aleksagordic
🔥 Things are really heating up in AI:
* New Mistral 8x22B MoE (170B) model just came out - we'll see how it
@aleksagordic
Very happy to announce SlovenianGPT-Instruct! 🥳 The model significantly improves upon the base model across all of the benchmarks and is by far
@aleksagordic
Extremely cool new model release from AI21 Labs - Jamba - and it's not even a transformer! It's a hybrid model that combines