tl;dr
Today we talk about the M11 Monochrom sensor, learn to write better prompts so we can get help writing better content, and start a conversation about the future of computing.
Welcome to the 190 new Morfternighters who joined us last week.
I love having you here and hope you'll enjoy reading Morfternight.
Good Morfternight!
Greetings from Vienna, Morfternighters!
We will now return to our regular service following the Morfternight Special Edition sent out last Thursday. If you happened to miss the email due to the unusual day and subject, you can access it through the link provided above.
Photo of the week
Work · More Photos
Chef Marc Forgione cooking a mean Carbonara.
The scene occurred last Monday night in the cellar of Peasant, one of his NYC restaurants. It was dimly lit, and he moved as fast as professional chefs do.
You generally won't hear me talk much about technical details. You know the drill: gear doesn't matter, the best camera is the one you have with you, yadda yadda yadda…
It's not limited to photography; you can think of other areas, like computers, cars, or cooking. Most of the time, any computer, car, pot, or pan will do the job.
Most of the time.
Every once in a while, you face a situation where the right tool can get the job done, and a more generic one won't.
I am still getting to know the M11 Monochrom, and it has already impressed me with its ability to capture nuanced scenes with excellent dynamic range.
However, the camera has exceeded my expectations in two particular areas.
First, the rangefinder allowed me to capture a shot I would likely have missed had I relied on autofocus. Second, the low-light image quality is remarkable. I shot at ISO 12,500 with a 1/30s shutter speed, and I am amazed by how little noise there is. In fact, the noise adds a beautiful touch to the image.
I have included a cropped detail at 100% for you to see.
I am looking forward to more low-light adventures with this camera!
A few places to visit
Believe it or not, we'll all become Prompt Engineers one day. Large Language Models (LLMs) make the power of computers accessible to all humans, so we should learn how best to take advantage of this. To help us with this task, OpenAI and DeepLearning.ai created a free course: ChatGPT Prompt Engineering for Developers. I just signed up, but if you want to understand better what this is about, here's an excellent course overview.
If you're unfamiliar with LLMs and how they function but have limited time to research, I recommend reading "How ChatGPT and Other LLMs Work—and Where They Could Go Next" on Wired. This article strikes a balance between superficial content and academic papers.
I am incredibly proud of our recent work to release the Jetpack AI Assistant for your WordPress site. This is why I am doing something unusual and sharing some work-related content. You can read our blog post, Meet Your New Creative Writing Partner — The Jetpack AI Assistant, to learn more. If you try it and have any feedback, please share it with me. I am listening!
Apple's vision for the future is Vision Pro
Last week was WWDC, Apple's developer conference, and Monday brought the traditional yearly keynote where Apple announces things other than phones and watches.
As has been customary for many years, the new versions of the various operating systems were presented, from iOS 17, iPadOS 17, and tvOS 17 to watchOS 10 and macOS 14.
A new 15″ MacBook Air, a Mac Pro powered by Apple Silicon, and many AI-powered features were announced (although the word "AI" was absent from the keynote), but the real star of the show was the Apple Vision Pro.
At this point, if you have no idea what I am talking about, I will offer you three options to catch up before I share a couple of thoughts about this product.
If you don't have time to watch the entire keynote, here's an excellent summary in about 15 minutes:
If you have more time, would rather get your information from the primary source, and admire Apple's marketing machine, here's the entire keynote:
Finally, if you have even more time and want to dig deeper, here's the yearly live episode of The Talk Show with John Gruber:
All right, now that you're caught up on the announcement, here's what I think.
The Vision Pro reminds me of the introduction of the iPhone in 2007.
At launch, some of us were enthusiastic, and others dismissed it. The primary difference between the two groups was that while the former considered it the first step of a new journey, the latter saw an overpriced, overengineered solution to a problem that had already been solved.
I think Apple's approach of merging augmented and virtual reality is far better than focusing primarily on either one.
As incredible as it seems, there was a time not long ago when wearing a headset and talking on the phone in the street was frowned upon.
I bet that within five years, XR headsets like the Vision (but lighter, cheaper, and with longer battery life) will be the norm whenever we are alone, at home, on public transport, or working.
Will it be a positive or negative change?
As with the phone, both are possible, depending on how we each use them. By inventing a new way to use computers, we automatically create all the problems that come along with it.
Just as inventing the airplane simultaneously created the plane crash.