• Evil_Shrubbery@lemm.ee · 2 hours ago

    Wait, I just got my mom an S25 and this def isn’t on it (nor is it on my phone).

    … is it the Play Store that installs it (starts the download-and-install process, I mean, not what actually “installs” it)?

    So if the Play Store is disabled while Play services are still enabled, you don’t get this app auto-shoved onto your device?
    That’s nice.
    (Or is it a regional thing and EU isn’t on the shitlist yet?)

    • Evil_Shrubbery@lemm.ee · 2 hours ago

      Strive to always do the most evil, as it will usually correlate with profit & power. If faced with two equally evil options, select the shadier one. And always ask how to make each iteration even more evil & shady.
      - your generic* MegaCorp

      *but all of them

  • RizzoTheSmall@lemm.ee · 8 hours ago

    I feel like clicking that link would make it super evident that you have ‘sensitive content’ to hide.

    • skuzz@discuss.tchncs.de · 5 hours ago

      I already removed it a couple of weeks ago, because:

      • Get the fuck out of my phone you trash panda of a software company.
      • You’re not wasting my battery power, my network, and my cellular data, just to fulfill some spy crap.
      • Don’t change things on my phone without my consent. Same reason I left Windows; odd parallels there too… without consent…
      • But mostly, this tech isn’t just about detecting bad photos. This tech can be targeted on any kind of photo.

      The same issues popped up when Apple said it was going to put such scanners in iOS and got blowback. (As for whether they actually did, I have a screenshot of an image scanner running sometime late last year that I’ve been meaning to dig into, but I haven’t found it yet.)

      It is basically the classic government-overreach trope, done by a corporation: “For the children!”

      Steps:

      • Find an excuse to put something bad in machines under the guise of some protected class everyone agrees should be protected.
      • Wait until it is normalized and people don’t think about it anymore.
      • Reconfigure it to hunt for what you want on demand.
        • Want to find out who all has been to a recent protest? Configure the image scanner to return pings from protest signs.
        • Looking for people who have been around a particular “person of interest”? Send out a query to receive pings back wherever that person’s face was detected in their photos.
        • Looking for people of a certain race in a certain geolocation? Send out a query for certain skin colors in recent photos and wait for those pings back, with GPS geotags, of course.

      They don’t even need the photo, just “evidence” to start going after people and “detaining” them.

      This. Shit. Is. Pure. Evil.

      • Evil_Shrubbery@lemm.ee · 2 hours ago

        Thx and yes to all … but pls don’t use such cute names for the evilest things out there. Raccoons are way too cute to be mentioned in the eldritch-type evil category with Alphabet.

    • x00z@lemmy.world · 2 hours ago

      This is why it’s important to click on everything.

      If you looked up instructions on how to make a bomb, but also on how to defuse a bomb, they will never know what your real motive is.

    • DerisionConsulting@lemmy.ca · 8 hours ago

      It detects nudes, and a lot of people will have nudes on their devices.

      The steps from the article:

      So, if you wish to uninstall or disable SafetyCore, take these steps:

      • Open Settings: Go to your device’s Settings app
      • Access Apps: Tap on ‘Apps’ or ‘Apps & Notifications’
      • Show System Apps: Select ‘See all apps’ and then tap on the three-dot menu in the top-right corner to choose ‘Show system apps’
      • Locate SafetyCore: Scroll through the list or search for ‘SafetyCore’ to find the app
      • Uninstall or Disable: Tap on Android System SafetyCore, then select ‘Uninstall’ if available. If the uninstall option is grayed out, you may only be able to disable it
      • Manage Permissions: If you choose not to uninstall the service, you can also check and try to revoke any SafetyCore permissions, especially internet access

      However, some have reported that SafetyCore reinstalled itself during system updates or through Google Play Services, even after uninstalling the service. If this happens, you’ll need to uninstall SafetyCore again, which is annoying.
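      For those comfortable with a command line, the same removal can be done from a computer over adb. This is only a sketch: it assumes the package name is com.google.android.safetycore (how the app is commonly reported to appear), so verify it on your own device with the list command first. The ADB variable defaults to a dry run that just echoes the commands; set ADB="adb shell" to actually run them against a connected device.

```shell
#!/bin/sh
# Assumed package name for "Android System SafetyCore" -- verify on your
# device before running anything destructive.
PKG="com.google.android.safetycore"

# Dry run by default: prints the commands instead of executing them.
# Use ADB="adb shell" with a connected, USB-debugging-enabled device.
ADB="${ADB:-echo adb shell}"

# Confirm the package is actually present:
$ADB pm list packages | grep -i safetycore || true

# Remove it for the current user (Play services may reinstall it later,
# as noted above, so this may need repeating):
$ADB pm uninstall --user 0 "$PKG"

# Or, to keep it installed but inert, disable it instead:
$ADB pm disable-user --user 0 "$PKG"
```

      Note that `pm uninstall --user 0` only removes the app for the current user profile; the APK delivered via Play remains available to the system, which is consistent with the reinstall-on-update behavior people have reported.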

      • sunzu2@thebrainbin.org · 7 hours ago

        SafetyCore reinstalled itself during system updates or through Google Play Services

        Big tech is essentially run by rapists.

  • morrowind@lemmy.ml · 5 hours ago

    All of this happens on-device to protect your privacy and keep end-to-end encrypted message content private to only sender and recipient. Sensitive Content Warnings doesn’t allow Google access to the contents of your images, nor does Google know that nudity may have been detected. This feature is opt-in for adults, managed via Android Settings, and is opt-out for users under 18 years of age. Sensitive Content Warnings will be rolling out to Android 9+ devices including Android Go devices with Google Messages in the coming months.

    Y’all need to chill sometimes.

    • Luffy@lemmy.ml · 4 hours ago

      If I told you that I will give you free candy if you get into the back of my white van, would you do it?

      The software is closed source, and Google has a bad privacy record. Do you really think they won’t do it again?

      • morrowind@lemmy.ml · 3 hours ago

        If you’re trying to think this through, you’ve got to think about incentives. Google is in the ads business. They have no reason to want your nudes; it doesn’t help them sell anything to you.

        • Evil_Shrubbery@lemm.ee · 2 hours ago

          You might not understand how Google handles data (of unique users, or other internet info). Their core business model is based on exactly that: they read & record your private data & model ads based on it.

          They won’t just scan your nudes, but all pics (not that nudes don’t contain useful data for Google).

          Maybe I’ll pay Google money to show ads for my capsaicin-based talcum alternative only to people with smooth buttholes. And as I do so (for a modest amount of money per potential client) Amazon will offer you specifically my Chilly lube.

          (And the “local” claim is ofc BS; it can just mean they use your CPU time instead of theirs but still get the same computed data in the end.)

      • TrickDacy@lemmy.world · 4 hours ago

        Well, by that logic, using anything Google has ever touched would be insanely out of the question, rendering this article and thread moot. I get that this feature sounds sketchy, but at some point paranoia should lead you to another platform rather than speculating about something just because it was brought to your attention.