Morfternight #56: Bears shake at 4 Hz
The one about the upcoming AI and today's Machine Learning revolutions.
🤩 Welcome to the 4 new Morfternighters who joined us last week.
We love to have you here, and I hope you’ll enjoy reading Morfternight.
If you do, remember to share with a friend by clicking on this button.
📷 Photo of the week
White Bear Shakes - More Photos
👋 Good Morfternight, friends!
One month into the northern hemisphere autumn, we still have sunny days hitting 20° Celsius (68° Fahrenheit for my imperial friends).
Long-term, that’s irrelevant anecdotal data at best and, at worst, a bad sign that climate change will strike us hard.
Medium-term, though, it's good, as it means we haven't turned the heating on yet, saving precious gas. Warm weather might bring energy independence from Russia sooner rather than later.
Short-term, it meant being able to visit Vienna's Tiergarten with friends this morning and take some photos, including the one above of a polar bear shaking off the water after a dive.
Tonight, I discovered that the frequency at which mammals shake to get rid of excess water in their fur is the subject of scientific research.
For a bear, it's about 4 Hz.
Some animals with a very slow lifestyle, like sloths, can’t shake off the water as they can’t move fast enough. Somehow, I find this a bit sad.
🗺️ Three places to visit today
With generative AI tools becoming mainstream and evolving faster than most of us can follow, I have been trying to better understand what happened over the past ten years. Progress seemed steady but linear when, in fact, it was accelerating.
Maybe exponentially?
First of all, if you haven't read it, I strongly advise checking out Tim Urban's 2015 post, The AI Revolution: The Road to Superintelligence. Yes, it's seven years old, but don't let that stop you: believe me, it is still worth reading today, perhaps more than ever.
Then, lest you think I'm confusing the two, check out this short post about why Machine Learning is NOT Artificial Intelligence.
Finally, here's a retrospective on what has been going on for the past ten years: 10 years later, deep learning ‘revolution’ rages on, say AI pioneers Hinton, LeCun, and Li
🎙️ Do you have some time?
Here's something I don't usually do: recommend a podcast episode I haven't yet been able to listen to.
Today, though, it is different.
The Lex Fridman podcast is, without a doubt, the best podcast I have listened to, with surprisingly consistent quality across episodes despite hosting experts in many different fields.
Balaji Srinivasan, meanwhile, has given some of the best interviews I have heard on other podcasts.
With this in mind, I want to highlight right away the latest episode of Lex Fridman's podcast with Balaji Srinivasan, which I'll start listening to tomorrow.
It's eight hours long, so it'll take me some time, and if past experience is any indication, I'll need to listen to it at least twice.