Morfternight #89: Becoming a Prompt Engineer
The one with a bet on Apple Vision Pro.
Today we talk about the M11 Monochrom sensor, learning to write better prompts to get help writing better content, and we start a conversation about the future of computing.
🤩 Welcome to the 190 new Morfternighters who joined us last week.
I love having you here and hope you’ll enjoy reading Morfternight.
👋 Good Morfternight!
Greetings from Vienna, Morfternighters!
We will now return to our regular service following the Morfternight Special Edition sent out last Thursday. If you happened to miss the email due to the unusual day and subject, you can access it through the link provided above.
📷 Photo of the week
Chef Marc Forgione cooking a mean Carbonara.
The scene occurred last Monday night in the cellar of Peasant, one of his NYC restaurants. It was dimly lit, and he moved as fast as professional chefs do.
You generally won’t hear me talk much about technical details. You know the drill: gear doesn’t matter, the best camera is the one you have with you, yadda yadda yadda…
It’s not limited to photography; you can think of other areas, like computers, cars, or cooking. Most of the time, any computer, car, pot, or pan will do the job.
Most of the time.
Every once in a while, you face a situation where the right tool can get the job done, and a more generic one won’t.
I am still getting to know the M11 Monochrom camera, and it has already impressed me with its ability to capture nuanced scenes with excellent dynamic range.
However, the camera has exceeded my expectations in two particular areas.
Firstly, the rangefinder allowed me to capture a shot I would likely have missed had I relied on autofocus. Secondly, the low-light image quality is remarkable. I shot at 12,500 ISO and 1/30s shutter speed, and I am amazed by how little noise there is. In fact, the noise adds a beautiful touch to the image.
I have included a cropped detail at 100% for you to see.
I am looking forward to more low-light adventures with this camera! ✨
🗺️ A few places to visit
Believe it or not, we’ll all become Prompt Engineers one day. Large Language Models (LLMs) make the power of computers accessible to all humans, so we should learn how to best take advantage of this. To help us with this task, OpenAI and DeepLearning.ai created a free course: ChatGPT Prompt Engineering for Developers. I just signed up, but if you want to understand better what this is about, here’s an excellent course overview.
If you’re unfamiliar with LLMs and their functions but have limited time to research, I recommend reading “How ChatGPT and Other LLMs Work—and Where They Could Go Next” on Wired. This article strikes a balance between superficial content and academic papers.
I am incredibly proud of our recent work to release the Jetpack AI Assistant for your WordPress site. This is why I am doing something unusual and sharing some work-related content. You can read our blog post, Meet Your New Creative Writing Partner — The Jetpack AI Assistant, to learn more. If you try it and have any feedback, please share it with me. I am listening!
🐘 Apple’s vision for the future is Vision Pro
Last week was WWDC, the Apple developer conference, and on Monday Apple held its traditional yearly keynote, the one where it announces things other than phones and watches.
As has been customary for many years, the new versions of the various operating systems were presented, from iOS 17, iPadOS 17, and tvOS 17 to watchOS 10 and macOS 14.
A new 15” MacBook Air, a Mac Pro powered by Apple Silicon, and many AI-powered features were announced (although the word “AI” was absent from the keynote), but the show’s real star was the Apple Vision Pro.
At this point, if you have no idea what I am talking about, I will offer you three options to catch up before I share a couple of thoughts about this product.
If you don’t have time to watch the entire keynote, here’s an excellent summary in about 15 minutes:
If you have more time, would rather get your information from the primary source, and want to admire Apple’s marketing machine, here’s the entire keynote:
Finally, if you have even more time and a wish to dig deeper, here’s the yearly live episode of The Talk Show with John Gruber:
All right, now that you’re caught up on the announcement, here’s what I think.
The Vision Pro reminds me of the introduction of the iPhone in 2007.
At launch, some of us were enthusiastic, and others dismissed it. The primary difference between the two groups was that while the former considered it the first step of a new journey, the latter dismissed it as an overpriced, overengineered solution to a problem that had already been solved.
I think that Apple’s approach to merging augmented and virtual reality is way better than focusing on either one primarily.
As incredible as it seems, there was a time not long ago when wearing a headset and talking on the phone in the street was frowned upon.
I bet that within five years, XR headsets like the Vision Pro (but lighter, cheaper, and with longer battery life) will be the norm whenever we are alone, at home, on public transport, or working.
Will it be a positive or negative change?
As with the phone, both are possible, depending on how each of us uses them. By inventing a new way to use computers, we automatically create all the problems that come along with it.
Like inventing the airplane simultaneously created plane crashes.