atonse 20 hours ago

At this point, just about anything Apple can do will be way way way better than the absolute turd that is Siri. (It was only impressive 15 years ago).

Apple's AI strategy has seriously hurt their reputation. I'd love to be a fly on the wall where they settled on a strategy that amounted to: "Forget using even the most basic LLM to understand combinations of commands like 'stop all the timers and just keep the one that has about four minutes left' or 'turn on the lights in rooms x, y, and z and turn off the fans around the house.' Let's try to invent a completely new wheel instead, one that will get us bogged down in tech hell for years without ever making any progress."

They could've just improved the things probably 99% of people use Siri for (Music, Home, Timers, Weather, Sports Scores) without developing any new tech or reinventing any wheels, and in the background continued to iterate in secret like they do best. Instead, they have zero to show for the two years since good LLMs have been out.

Even my son suggested things like "I wish your phone had ChatGPT and you could ask it to organize all your apps into folders" – we can all come up with really basic things they could've done so easily, with privacy built in.

  • Rebuff5007 20 hours ago

    I know this is the leading narrative, but I actually disagree.

    Apple has a wonderful set of products without any form of generative AI, and those products continue to exist. Yes, there is an opportunity to add some fancy natural-language-based search and control, but that seems like relatively low-hanging fruit compared to the moat they have and defend.

    Will adding gen AI natively to Apple products have people non-trivially change the way they use iPhones or Macs? Probably not. So there is literally no rush here.

    • atonse 19 hours ago

      It's not about being fancy. My examples are so utterly dull.

      Being able to say "turn on the lights in the living room and turn off the fans in the kids' rooms" – is not a crazy use case.

      Instead, I literally have to say:

      - Turn on Living Room light

      - wait

      - turn off <son's name> bedroom fan

      - wait

      - turn off <daughter's name> bedroom fan

      Yes, I could actually say "turn off all the fans" (I believe Siri understands that) but that's usually not what I want.

      Another example, you have 3-4 timers going: Let's say I'm cooking and have an oven timer, but also have a timer for my kids to stop their device time. I may have a few going. But being able to say "cancel all the timers except the longest one" is TRIVIAL for a first year programmer to implement. But instead, it's a slog with Siri.
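
      To illustrate just how small the actual logic is, here's a rough sketch in Swift. The TimerInfo type and the timer list are hypothetical stand-ins, not a real Siri or HomeKit API; the hard part is understanding the sentence, not this:

        import Foundation

        // Hypothetical stand-in for whatever the system tracks per running timer.
        struct TimerInfo {
            let label: String
            let remaining: TimeInterval  // seconds left
        }

        // "Cancel all the timers except the longest one."
        func timersToCancel(_ timers: [TimerInfo]) -> [TimerInfo] {
            guard let longest = timers.max(by: { $0.remaining < $1.remaining }) else {
                return []  // nothing running, nothing to cancel
            }
            return timers.filter { $0.label != longest.label }
        }

        let running = [
            TimerInfo(label: "Oven", remaining: 240),
            TimerInfo(label: "Device time", remaining: 1800),
            TimerInfo(label: "Laundry", remaining: 600),
        ]
        // Cancels "Oven" and "Laundry", keeps "Device time" (assumes labels are unique).
        print(timersToCancel(running).map(\.label))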

      • rcarmo 18 hours ago

        Actually, what you describe should be feasible with the new on-device foundation models (I haven’t installed the beta myself, but in my close friend group we’ve been suggesting prompts to the couple of brave people who do Swift development, and the foundation models seem able to do that).
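
        For the curious, the kind of thing we've been poking at looks roughly like this. It's only a sketch based on the Foundation Models API shown in Apple's WWDC materials; I haven't run this exact code, and the initializer and response shape may differ in the shipping beta:

          import FoundationModels

          // Sketch: hand a compound home-automation request to the on-device model
          // and ask it to break the request into discrete commands.
          func plan(for request: String) async throws -> String {
              let session = LanguageModelSession(
                  instructions: "Split the user's request into individual smart-home commands, one per line."
              )
              let response = try await session.respond(to: request)
              return response.content
          }

          // e.g. plan(for: "turn on the lights in the living room and turn off the fans in the kids' rooms")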

        • atonse 22 minutes ago

          That's wonderful news. I hope it translates to being baked into the built-in apps like Home/Siri.

    • quitit 13 hours ago

      There is a consequence to shifting to LLMs. Despite Siri's reputation, it is a well-used product (1), and despite HN's constant noise, Siri actually works very well for controlling other Apple devices, noticeably better than Alexa (the other digital assistant I use regularly).

      Switching that to an LLM-based system represents a massive increase in computational requirements without moving the needle for most requests. However fancy it may be, users don't want to sit and have a verbose ChatGPT-style conversation with Siri; they just want the command run and done. So any advertised change to Siri will need to be large enough that Siri can seemingly decode any request with minimal or no follow-up questioning. Anything short of this will be largely derided and face the same backlash as current-era Siri.

      At the moment Siri answers many trivial requests without the use of an LLM. You can speak to Siri with relative terms or needs-based requests: saying "It's dark in here" will result in Siri turning on the lights in just the room where the request was made (2), even if other receivers in the house heard the request. It's also smart enough to recognise that if you label a room as something like the "office" but later make a request for the "study", it will ask whether you actually meant the "office".

      The big caveat here is that Siri's abilities change based on the language selected; non-English languages appear to have less flexibility in the types of requests and the syntax used. Another factor is that requests during certain peak periods appear to be handled differently, as if there are fall-back levels of AI smarts at the server level. To get around that, the new Siri will need to be largely offline, which appears consistent with Apple's new AI strategy of local models for basic tasks and complex requests being sent to Private Cloud Compute.

      Like Apple Maps, I anticipate the pile-on to Siri will go on far longer than deserved, but what does seem certain is that change is coming.

      (1) Apple have stated that Siri is the most used digital assistant. However I have not found any supporting data for this claim other than Apple's own keynote address where the claim was made.

      (2) Requires that rooms are set up in HomeKit and there are per-room receivers, such as a HomePod in each room.

    • msgodel 17 hours ago

      This approach would be fine if users were empowered to add AI integration they wanted on their own.

      They are not, though. Absolute control over the platform means Apple has the responsibility to have more vision for the future than anyone else. They do not, and they will fail to satisfy their users. It will result in either a dramatic change of leadership and strategy, or it will drive customers elsewhere.

    • thejazzman 20 hours ago

      > Siri, open the east door

      < Do you want to open: the east door, the west door, or all the doors?

      > Siri, open the east door

      < Opening the east door

      They kinda really super suck. Siri used to work better than it does today. It's often a bigger chore than opening the app and tapping the button.

      These quirks hit me on a daily basis when all I want to do is control my lights and locks

      • JumpCrisscross 19 hours ago

        Turn off Apple Intelligence. I got sick of Siri asking, when I asked for the garage door to be opened, whether I meant my house in Wyoming or my 6th-story apartment in New York (which doesn't have a garage).

    • halJordan 20 hours ago

      Obviously reasonable minds may disagree. And I do disagree with your disagree-al. Your reasonable response necessarily stems from a foundation that LLMs are just stochastic parrots incapable of non-trivially changing someone's usage. That isn't true, and it has been shown to be untrue in many domains at this point. And that's only from the chatbot form of LLMs. Tool usage and agents will reveal more new paradigms.

    • dzhiurgis 19 hours ago

      If OpenAI released a phone, Apple’s sales would be down 50%.

      At this point only a handful of irreplaceable apps are propping iOS up, and that won’t last.

      • acheong08 19 hours ago

        I highly doubt that OpenAI is capable of releasing a full phone that isn't just a reskin of a generic Android with "AI". iOS design sucks (IMO) and its app selection is smaller than Android's, but that's not what makes people buy iPhones. It's simple familiarity and marketing. I'll definitely be switching off my iPhone when it breaks, but that'll probably take at least a decade. Phones are pretty much feature-complete at this point; for a normal person there's almost no reason to upgrade.

      • rkomorn 19 hours ago

        What data backs this take of yours?

        What irreplaceable apps are propping up iOS? What's the data showing that 50% of iPhone users are basically just begging to get off the platform?

        • dzhiurgis 12 hours ago

          > What irreplaceable apps are propping up iOS?

          EV charging and smart ID (aka government ID). To a lesser extent, anything that's smart hardware: home automation, cameras, vacuums, cars, drones, etc. Then there are services, namely ride hailing and e-scooter hire. Plenty of mobile web apps are pretty crippled too, so you download apps.

          • rkomorn 6 hours ago

            I don't understand how you consider these iOS-specific? These apps are also all available on Android phones.

            If people, today, don't want to have iOS devices, they can switch to Android devices and use virtually all the same apps.

            What would OpenAI release that would affect Apple and not every other phone vendor? Or do you also think all Android sales would tank if OpenAI released a phone?

          • msgodel 12 hours ago

            Don't know about EV stuff, but I don't think normal ID cards are going anywhere. All the home automation stuff I've gotten speaks UPnP, and my quadcopter doesn't have an app at all AFAIK, but I shopped for an open source one. All the services seem to be moving toward the web, just because dealing with the review process is annoying and all you get from it is push notifications.

  • jurgenaut23 20 hours ago

    It doesn’t matter. No one buys an iPhone for Siri and no one switches to Android for whatever they call this thing. I have owned an iPhone for more than 15 years, and I have used Siri a dozen times.

    They will implement something using GPT-4 or Claude and this whole mess will be forgotten.

    • imglorp 19 hours ago

      I would. Hands free in the car or when mowing the grass, chatting with the AI would be huge.

      "Text my wife and say I'll be late." is still too much to ask: it responds with 20 questions about all the parameters.

      "turn up the volume" does actually work for the first time, lately. (Bravo, Android).

      "open app antenna pod and begin playing" is way out of the question. Suck.

      • matt-attack 2 hours ago

        Or how about at least having a full understanding of how iOS and its settings work, for things like "Hey Siri, turn off the phone," or "Hey Siri, why am I not hearing a ring when I get phone calls?" or "Why is my phone not going into silent mode when I get into bed at night?"

      • com2kid 11 hours ago

        If you rename your wife on your contact list to "my wife" it'll work!

        "Test <name> and say I'll be late" works fine. Sometimes the message gets sent, sometimes a request for confirmation is asked first. Irritating it isn't consistent.

        > "open app antenna pod and begin playing"

        This should work even with the old Google Assistant, assuming the app added the proper integrations.

        • imglorp 3 hours ago

          Yeah, the whole point of ML is that it learns without needing an integration. "Open app X" is something a 5-year-old can do (and Gemini can actually do this). But with "open X and play" it gets tripped up on the media play state. That 5-year-old would just look for a play button. Or if it's a Bluetooth-enabled app, it will have an API to play. There's still a gap.

    • haiku2077 19 hours ago

      I use Siri a lot for home automation, and am frustrated by how much better it could be.

      "Turn off all the lights in the house" works, but "turn off all the lights" does not. What?!??

    • fasthands9 19 hours ago

      I do think Apple needs a better Siri, but ultimately I think they were smart not to plow tons of money into trying to do it themselves.

      A better Siri is an expense to keep up the premium brand, not something that they will monetize. For particular uses of AI people will just want particular apps.

    • sandspar 19 hours ago

      Young people are increasingly comfortable using voice, and marketing agencies already consider Gen Alpha to be “voice native.” I once saw a small child help his grandfather with a phone issue. The grandfather fumbled with the GUI, but the child opened Siri and solved it by voice. If Apple drops the ball on voice, it may not hurt them today - but they risk losing the next decade.

      • JumpCrisscross 19 hours ago

        > Young people are increasingly comfortable using voice

        I know plenty of folks in their 40s and 50s who have used Siri as their primary way to search the internet for years.

        • dham 15 hours ago

          My mom does everything through voice on her iPhone. My son defaults to using Siri on the Mac for a ton of things. He grew up with an Alexa. It's really the in-between generation, who learned how to master a computer at a young age, that doesn't really use it.

      • jaredwiener 19 hours ago

        My in-laws use Siri/voice interactions almost exclusively, dictating text messages out-loud, searching for shows on Roku using the voice remote, etc.

        Even my 2.5 year old will ask Alexa and Siri to do things, sometimes far away from any device that could respond.

  • amluto 17 hours ago

    I would argue that the problem with Siri isn’t the model. Siri is perfectly fine at transcribing speech, although it seems to struggle with figuring out when an instruction ends. But Siri is awful at doing anything with even simple instructions:

    - It regularly displays an accurate transcription with exactly the same text that usually works and then sits there, apparently indefinitely, doing nothing.

    - Sometimes it’s very slow to react. This seems to be separate from the above “takes literally forever” issue.

    - Siri is apparently incapable of doing a lot of things that ought to work. For example, for years, trying to use Siri on a watch to place a call on Bluetooth (while the phone is right there) would have nonsensical effects.

    These won’t be fixed with a better model. They will be fixed with a better architecture. OpenAI and Anthropic can’t provide that, except insofar as they might inspire Apple to wire useful functionality up to something like MCP to allow the model to do useful things.

    > Even my son suggested things like "I wish your phone had ChatGPT and you could ask it to organize all your apps into folders" – we can all come up with really basic things they could've done so easily, with privacy built in.

    I’m not convinced the industry knows how to expose uncontrolled data like one’s folders to an LLM without gaping exploit opportunities. Apple won’t exploit you deliberately, as that’s not their MO, but they are not immune to letting things resembling instructions that are in one of your folders exploit you.

canjobear 20 hours ago

I’m not surprised that Apple has been struggling to integrate LLMs. By their very stochastic nature they go against Apple’s philosophy of total vertical control of every aspect of their products. There’s no way to guarantee that an LLM will do anything and so they would be ceding control of the user experience to a random number generator.

  • hylaride 19 hours ago

    The problem with Apple is that their corporate culture doesn't vibe well with a lot of new tech. Their culture of secrecy held them back with the original "smart assistants": all the best talent were PhDs who wanted to publish, but Apple wouldn't allow that. On top of that, they didn't pay top dollar like Google or Amazon did at the time (who also let their employees publish!).

    Apple is being far too conservative with far too fast-developing a technology to possibly keep up, unless they loosen up. But they're being run by a bunch of 50+ year old white guys trying to still be cool without understanding what's really going on. I'm not saying they need to publish a roadmap or anything, but they need to tell their marketing dept. to piss off and accept that not everything needs to be a "delightful surprise" on stage.

    • JumpCrisscross 19 hours ago

      > problem with Apple is that their corporate culture doesn't vibe well with a lot of new tech

      Apple has never been the company that does it first. They're the company that does it right. Arguably, their fuckup with LLMs was rushing a garbage product to market instead of waiting and watching and perfecting in the background.

      > Apple is being far too conservative with a far too fast a developing piece of technology

      Strongly disagree. OpenAI and Anthropic are blowing billions on speculative attempts at advancing the frontier. They're advancing, but at great cost and uncertainty in respect of future returns.

      The smart move would be to recapitulate the deal with Google, possibly being paid by these cash-burning companies for the default AI slot on the iPhone, all the while watching what Apple's users do. Then, when the technology stabilises and the best models are known, Sherlocking the whole thing.

      • bitpush 19 hours ago

        People love to parrot this, yet if you think about it for a second it isn't true at all, in so many ways.

        1. Siri - not the first assistant, absolute garbage.

        2. Apple Maps (original) - utter garbage at launch, slightly better today in US.

        3. Vision Pro - Not the first VR headset. Massive failure.

        If anything, Apple has been tremendously successful a few times when they were not first (phones, tablets, silicon...), but they have also tremendously faltered when they were not first.

        • mingus88 16 hours ago

          These first two bullets launched well over a decade ago.

          The third bullet is soft because ALL vr headsets have been flops.

          All told, you are actually painting a pretty solid picture of Apple's track record. They've launched so many things in the past 20 years and expanded into new markets (wearables, headphones, streaming hardware and services) that it's impressive there aren't more flops.

        • JumpCrisscross 19 hours ago

          Non sequitur. Nobody argued everything Apple has ever launched has been a success. I'm just pushing back on the notion that the "problem with Apple is that their corporate culture doesn't vibe well with a lot of new tech." That's not a problem, that's the key to their success (where it's been found).

        • barkerja 17 hours ago

          To be fair, Apple Maps is FAR better today than when it first launched.

          • bitpush 17 hours ago

            Yup, but limited in scope. And occasionally fails spectacularly.

  • leumon 19 hours ago

    They actually once used an on-device neural net in iOS 14, when they introduced the Translate app. It worked offline, but sometimes produced bad or hilarious translations.

  • hombre_fatal 19 hours ago

    Then how does Siri deliver on that philosophy? The UX of Siri is that it probably won't understand your request, but it might, and only through trial and error do you discover the specific commands it does know. And even then, your queries can fail for unknown reasons.

    So I don't think this is a likely explanation. Maybe they just wanted to have an in-house solution but realized they have no chance of delivering that value on their own. But it can't be about UX predictability, because Siri has none unless you're setting a timer.

jonplackett 20 hours ago

I find this all quite baffling.

I’m pretty sure I could knock up a decent Siri clone with gpt-4o-mini, because I already did a bootleg Alexa to power our smart home stuff after Amazon removed the privacy controls. The only hard bit was the wake word.

Siri is currently so terrible that even something like Mistral 8b could do a decent job. Why don’t they just run something like that on their own servers instead?
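
For what it's worth, the core of that bootleg assistant is basically one tool-calling request. Here's a rough sketch against OpenAI's chat completions endpoint; the "set_lights" tool is a made-up example, and the wake word, speech-to-text, and actual smart-home calls are left out:

  import Foundation

  // Sketch: send one spoken command to gpt-4o-mini and get back a structured tool call.
  // Assumes the utterance contains no characters that need JSON escaping.
  func routeCommand(_ utterance: String, apiKey: String) async throws -> String? {
      let body = """
      {
        "model": "gpt-4o-mini",
        "messages": [
          {"role": "system", "content": "You control a smart home. Respond only with tool calls."},
          {"role": "user", "content": "\(utterance)"}
        ],
        "tools": [{
          "type": "function",
          "function": {
            "name": "set_lights",
            "description": "Turn the lights in a room on or off",
            "parameters": {
              "type": "object",
              "properties": {"room": {"type": "string"}, "on": {"type": "boolean"}},
              "required": ["room", "on"]
            }
          }
        }]
      }
      """

      var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
      request.httpMethod = "POST"
      request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
      request.setValue("application/json", forHTTPHeaderField: "Content-Type")
      request.httpBody = body.data(using: .utf8)

      let (data, _) = try await URLSession.shared.data(for: request)

      // Dig out the first tool call's JSON arguments, e.g. {"room":"living room","on":true}.
      let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
      let message = (json?["choices"] as? [[String: Any]])?.first?["message"] as? [String: Any]
      let calls = message?["tool_calls"] as? [[String: Any]]
      return (calls?.first?["function"] as? [String: Any])?["arguments"] as? String
  }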

  • martinald 20 hours ago

    I'm absolutely lost as to what is going on. Awni Hannun on Twitter works for Apple on the MLX team and is always completely up to date on all the local LLM models and their capabilities. They literally have a team of people building MLX for doing model inference on Apple Silicon.

    Does someone need to send someone an email to realise you don't need a huge frontier model to do basic tool calling like this?

    • anon373839 17 hours ago

      Not only that, but funneling every user query to Sam Altman isn’t exactly on-brand for Apple.

      • Nevermark 4 hours ago

        Well they have been funneling search to Google for years.

        Of course, Google pays them, but it’s still funneling info to a surveillance company vs. providing a private search.

        If Kagi can do it, Apple could.

  • int_19h 20 hours ago

    They are competing against Gemini on Android, so it stands to reason that they need something on par with that. Per TFA they are still talking about running those models on Apple's own servers.

    The more interesting question is how they're going to handle the 180 on all the talk about privacy.

    • tough 20 hours ago

      Claude can technically be used via Amazon Bedrock on your own managed AWS infra...

      But yeah, then you're not trusting Anthropic but Apple + Amazon.

      I dunno if that's even a win?

    • halJordan 19 hours ago

      If the model runs on Apple servers then the data isn't leaking anywhere. There's no 180 to discuss.

      • int_19h 19 hours ago

        There's a big difference between running on a local device and running on Apple's servers, and their previous stance was that most things would be in the former category. Switching to cloud (even if it's Apple's cloud) for regular Siri stuff would be a big 180.

    • JumpCrisscross 19 hours ago

      > They are competing against Gemini on Android, so it stands to reason that they need something on par with that

      Why? What does Gemini actually do, that users actually use, that requires deep integration into the OS?

    • WA 20 hours ago

      They could secretly make Siri slightly better in intervals. People have a low opinion of Siri anyways. No way they compare Siri to Gemini. For them, Siri might just stop sucking completely at some point and then the comparison is between the Siri of the last several weeks and the old Siri.

      • hylaride 20 hours ago

        > Siri might just stop sucking completely at some point and then the comparison is between the Siri of the last several weeks and the old Siri.

        The same thing did happen to Apple Maps, but many people still default to Google (though Google Maps is still significantly better at finding businesses). But Apple was humiliated by the Apple Maps rollout. Siri has just been a slow-burning joke that's only really useful for setting a timer or reminder.

      • dabbz 12 hours ago

        There's very little margin for error when a user tries something over speech (where they don't know what's actually possible). A user tends to try something once; if it fails, they almost never try it again. So now the question is, how do you get a user to try it again once you've fixed it? Alexa's approach has just about driven everyone mad: "By the way, did you know..."

  • singularity2001 10 hours ago

    While everyone agrees that Siri is crap, saying you could knock out a decent clone yourself completely underestimates the complexities involved.

drexlspivey 20 hours ago

There are rumors going around that Apple is trying to buy Perplexity, which makes no sense to me.

Perplexity doesn’t have their own foundation model, they just wrap existing models, so what good are they? They should buy Mistral instead.

  • nickthegreek 18 hours ago

    I really like Perplexity. It’s now my default search. There are ways to get Perplexity Pro extremely cheap. I recommend those with a passing interest try it out.

    • barkerja 17 hours ago

      Perplexity is great. I’ve been a happy paying customer for a while.

  • halJordan 19 hours ago

    That rumor was about Apple replacing its search engine. Perplexity has a home-grown search engine.

    Regardless, I don't accept the false constraint that Apple simply must buy a foundation model, or whatever that means. Perplexity's "wrapper" does a better job than ChatGPT or Gemini in its domain.

    • drexlspivey 19 hours ago

      Apple needs a team that knows how to train an LLM. Why would they need to buy Perplexity to use their search engine?

      • barkerja 18 hours ago

        To allow users to query for anything? The index isn't to train and build a model, it's to behave as a search engine for their users. It would (or could) effectively replace Google for Apple users.

        Imagine an updated Spotlight that would allow the user to enter any query, obtaining information from the internet, enriched with their local context/data.

        LLM Siri is an entirely different concern than Apple potentially acquiring Perplexity. I view them as two wildly different initiatives.

  • tough 20 hours ago

    I read in some other thread that Perplexity's Android integration and app were somewhat decent. Might be the easiest way to acquire a team for Android dev?

Melatonic 15 hours ago

I downloaded Apple Intelligence and realised I was probably never going to use it. Fully disabled. Siri has been disabled since day 1 (I moved over from Android to the iPhone 15 Pro Max).

I would much rather see small individual uses of AI using the quite powerful hardware than another chatbot.

Photo editing, for example: the new AI feature to remove an object (like a random person behind you in a selfie) works great. Give us more upgrades like that, with real-world uses! I don't care about some giant all-encompassing Siri. I don't even like talking to my phone in general.

jonplackett 20 hours ago

Maybe they should just start putting 16 GB of RAM in iPhones from now on and make their local inference job so much easier.

  • drexlspivey 20 hours ago

    Too late, the iPhone 17 Pro is already in production with 12 GB. They should have done it two years ago; it was such an obvious move.

  • raverbashing 20 hours ago

    I know, right

    I bet Apple put 16GB on their notebooks as default while grinding their teeth and cursing at the whole $5 of extra cost per unit

    • alwa 20 hours ago

      $5 of cost, $195 of foregone profit...

tzs 18 hours ago

I don't use Siri much, but I have noticed, sometime over the last few months, a problem in something that Siri uses: voice dictation. I use it all the time on my iPad to enter search terms.

So for instance if I wanted information on public transit options in London I'd tap the search bar in Safari, tap the mic icon, and say "Public transit options in London" and that used to work pretty much all the time. It would even work if I had a loud TV on or loud music on, and it was great about realizing when I'd stopped speaking and automatically starting the search.

Lately it has tended to cut off early, so I only get "Public transit options" entered. I can get it to work if I try again and say each word very loud and with a distinct short gap between the words.

My understanding is that modern dictation systems make heavy use of deep learning, so I'd expect it shares some underlying technology with Siri. I wonder if there is a problem with that underlying technology?

daft_pink 8 hours ago

Hope they can address the fact that many of the devices we use for Siri are low-powered and it’s not practical to put expensive chips in them, such as our HomePods.

I’m sure that has something to do with why something like OpenAI looks attractive. The one case where they would want to run their own AI is running locally on device.

bilsbie 19 hours ago

It’s getting kind of silly that we don’t have AI on phones in a usable way.

My wishlist:

Let me talk to AI about anything on my screen. Hey AI why did this guy email me? Hey AI what’s this webpage about? Etc

AI designs the UI on the fly depending on the task I’m doing. No more specific apps, just a fluid interface for whatever I need.

Leave AI in listening or video mode and ask about my environment or have a conversation.

  • minimaxir 19 hours ago

    The bottleneck for AI on phones is more hardware/compute, which due to the development lifecycle always lags a bit; the two years since the LLM boom roughly tracks the changes that would need to land to meet the moment (e.g. iPhones shipping with more RAM for Apple Intelligence).

brikym 13 hours ago

Siri is basically only good for checking the weather, starting a timer and basic maths.

It messes up half the tasks I ask of it:

Alarm rings: "Hey Siri, stop" -> "Which alarm would you like me to delete?"

"Hey Siri, how far is it from the bottom of [hiking trail] to the top?" -> "It'll take you x hours to walk from your location to the trail."

"Hey Siri, call Jeff" -> [Calls a different person]

It always keeps listening until I tell it to go away. Feel free to reply with your examples of not being taken Sirisly.

dham 15 hours ago

They don't need Anthropic or OpenAI. Literally just go to ollama.com and throw a dart at a random model. That will be better than whatever they are doing now.

blakesterz 20 hours ago

I would've held off on buying a new phone another year (AT LEAST) had I known all that Apple Intelligence hype was just hype.

  • cheeze 20 hours ago

    Did people actually think "we've added AI" was a selling point for the new flagships?

    Felt like the most obvious "CEO says we need to do this, doesn't matter if it isn't ready" kinda thing. Straight up checking a box for parity with Samsung et al.

adabyron 19 hours ago

IMO Apple's play here is to be the host that runs something like MCP servers, and to allow/encourage app devs to let users ask Siri to make requests that utilize their apps.

Then we can interact with multiple apps all via Siri and have them work together. To me that's a huge win.

  • barkerja 17 hours ago

    That’s essentially what App Intents are.
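
    Right; for anyone who hasn't seen the framework, exposing an action to Siri/Shortcuts looks roughly like this. A sketch only: the intent and the kettle example are made up, and the plumbing that actually surfaces it to Siri isn't shown:

      import AppIntents

      // Hypothetical example: an app exposes one action that Siri/Shortcuts can invoke.
      struct StartKettleIntent: AppIntent {
          static var title: LocalizedStringResource = "Start the Kettle"
          static var description = IntentDescription("Starts heating the smart kettle.")

          @Parameter(title: "Temperature")
          var temperature: Int

          func perform() async throws -> some IntentResult & ProvidesDialog {
              // A real implementation would talk to the device here.
              return .result(dialog: "Heating the kettle now.")
          }
      }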

    • adabyron 2 hours ago

      Thank you. I didn't know this had a name.

bilsbie 19 hours ago

I wish it could be on device though. I’d upgrade my phone for that.

BenGosub 17 hours ago

Why not use one of the open source models?

gdiamos 19 hours ago

Go with Anthropic.

UltraSane 13 hours ago

I truly do not understand how the same company that can create a truly innovative and incredible bit of hardware like the Vision Pro can also let Siri stagnate for almost its entire life.

  • bitpush 13 hours ago

    They painted themselves into a corner really. It started with privacy marketing and then they started believing it. Data = radioactive etc etc

    Siri can't get better without access to copious amounts of data; data Apple doesn't have.

    They are fucked

guerrilla 19 hours ago

Is it not uncharacteristic that they're talking about this in public?

  • eviks 12 hours ago

    They aren't?

    > Representatives for Apple, Anthropic and OpenAI declined to comment.

    • guerrilla 7 hours ago

      I mean we know it's happening, so they definitely are.

      • eviks 7 hours ago

        No, that's not how talking in public is defined

        • guerrilla an hour ago

          Someone obviously talked. It's impossible for us to know this is happening otherwise. Normally we don't know what Apple does until they announce.

drivingmenuts 20 hours ago

And here's me trying to figure out what I would need AI on a phone for. Apps are going to phone home and use their own AI, not Apple's. I don't need an AI to set a timer, search Google, or add to my calendar. If I write anything, I do it on my main machine.

Really wish this would be optional, but you know it won't be.

seydor 20 hours ago

I understand Apple's strategy. If they had really good AI, their phones and watches would be reduced to a microphone and speaker. No more advantage. So they stick to crappy AI that forces users to tap on their phone in frustration instead. Their idea of running OpenAI models is meant to make people disable AI features altogether. Brilliant strategy (/s)

  • mrtesthah 20 hours ago

    Oh wow, you just came up with a smartphone killer! How can I invest in your multibillion dollar idea?

    /s

rorylawless 13 hours ago

If Apple still had “courage”, they’d give up on AI and release a truly revolutionary “average phone”: a phone stripped of social media apps and features, with access only to music, mapping and messaging apps.

  • bitpush 13 hours ago

    Genuine question: who's asking for that? You don't think HN is representative of the larger iPhone user base, do you?