
Thread: iPhone forced update coming

  1. #1
    Join Date
    Sep 2001
    Location
    Wateree, South Carolina
    Posts
    48,810

    Default iPhone forced update coming

    If you don't want the freaks in Cupertino seeing the images on your phone and cloud, you may want to clear them. Of course these abortion and kiddie sex lovers are using child pornography as the excuse, but we can guess where this spytech is going for anyone who doesn't support the Great Reset...

    Apple will scan photos stored on iPhones and iCloud for child abuse imagery

    The feature will roll out in the US first

    By Jay Peters
    Aug 5, 2021

    Apple plans to scan photos stored on iPhones and iCloud for child abuse imagery, according to the Financial Times. The new system could help law enforcement in criminal investigations but may open the door to increased legal and government demands for user data.

    The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will roll out first in the US. Photos will be hashed and compared with a database of known images of child sexual abuse.

    “According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
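
    For anyone wondering what the “hash, voucher, threshold” flow described above could look like in practice, here is a rough Python sketch. It is only an illustration under simplifying assumptions: the hash function, the threshold value, and the names are invented for this example, and the real system reportedly relies on a perceptual hash and cryptographic threshold techniques rather than a plain SHA-256 digest and a counter.

    import hashlib

    # Illustrative stand-ins, not Apple's real database or threshold.
    KNOWN_CSAM_HASHES = set()   # hashes of known abuse images (supplied by NCMEC in the real system)
    REVIEW_THRESHOLD = 30       # number of suspect photos before human review; made-up number

    def make_voucher(photo_bytes):
        """Attach a 'suspect or not' marker to a photo before it is uploaded."""
        digest = hashlib.sha256(photo_bytes).hexdigest()  # stand-in for a perceptual hash
        return {"hash": digest, "suspect": digest in KNOWN_CSAM_HASHES}

    def account_needs_review(vouchers):
        """Flag the account only after enough photos have been marked suspect."""
        return sum(v["suspect"] for v in vouchers) >= REVIEW_THRESHOLD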

    Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”
    https://www.theverge.com/2021/8/5/22...uralmatch?ICID

    https://www.ft.com/content/14440f81-...2-a81458f5411f

  2. #2
    Join Date
    Jan 2005
    Location
    FROG LEVEL
    Posts
    23,785

    Default

    Just more dogshit of invasion of privacy and getting you programmed. Fuck Um
    Gettin old is for pussies! AND MY NEW TRUE people say like Capt. Tom >>>>>>>>>/
    "Wow, often imitated but never duplicated. No one can do it like the master. My hat is off to you DRDUCK!"

  3. #3
    Join Date
    Apr 2002
    Location
    upstate
    Posts
    9,696

    Default

    They are just announcing it.

    They’ve already seen em.

    Every image on the inter web is scanned against known victims. More added daily.
    A vote is like a rifle: its usefulness depends upon the character of the user.

    Theodore Roosevelt; 26th president of US (1858 - 1919)
    ____________________________________________

    “A fear of weapons is a sign of retarded sexual and emotional maturity” Sigmund Freud

  4. #4
    Join Date
    Jan 2004
    Posts
    15,351

    Default

    Time to add some photos of AR lowers with the third hole as well as a couple Glocks with a fun switch. That'll get someone's attention; might get a visit from one of the alphabet agencies.
    Amendment II A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.

    Quote Originally Posted by Highstrung View Post
    I like fishing topwater. Will one of you jot down some of this redneck ghetto slang and the definitions for those of us who weren't born with a plastic spoon in our mouths?

  5. #5
    Join Date
    Mar 2002
    Location
    GVL
    Posts
    4,363

    Default

    Poor Jennifer Lawrence.
    At least I'm housebroken.

  6. #6
    Join Date
    Jan 2004
    Location
    Campobello
    Posts
    3,033

    Default

    If that’s really going to happen, everyone with an iOS platform device should screenshot instances of leftist hypocrisy, too. Lots of legitimate MSM news headlines that include Epstein, the Clintons, Bill Gates, etc., and underage victims of sexual exploitation. Load it up.

  7. #7
    Join Date
    Dec 2006
    Location
    May River
    Posts
    7,338

    Default

    Droid...
    you aint did a dawg gon thang until ya STAND UP IN IT!- Theodis Ealey


    Quote Originally Posted by Rebel Yell View Post
    The older I get, the more anal retentive I get.

  8. #8
    Join Date
    Dec 2009
    Location
    Hampton Co./Bluffton
    Posts
    7,817

    Default

    And people think the vaccine is tracking them...
    Quote Originally Posted by Chessbay View Post
    Literally translated to, "I smell like Scotch and Kodiak".
    "Let us cross over the river, and rest under the shade of the trees"- Gen. Thomas "Stonewall" Jackson

  9. #9
    Join Date
    Dec 2011
    Location
    Rock Hill SC
    Posts
    9,154

    Default

    Put a few suppressor pics in there

  10. #10
    Join Date
    Jan 2009
    Location
    Murrells Inlet
    Posts
    2,302

    Default

    Some oil filters too.

    Doesn't surprise me at all.

    It's for your safety, remember......

  11. #11
    Join Date
    Mar 2002
    Location
    'Down in the Holler', SC
    Posts
    14,555

    Default

    It would be great if someone would do some Photoshop work and put all of the tech officials and Biden Cabal in diapers and post the pictures for everyone to download. And add some cute head pieces and rattles while they’re at it.

    Any takers?
    .
    Foothills Golden Retriever Rescue
    .
    "Keep your powder dry, Boys!"
    ~ George Washington

    "If I understood everything I said I'd be a genius." ~ 'Unknown'

  12. #12
    Join Date
    Jul 2003
    Location
    Moncks Corner
    Posts
    15,556

    Default

    Let's face it, this is an attempt to collect blackmail material and harvest teenaged girls' bathroom selfies for these perverts. As I understand it, it's illegal to view child porn whether you work for Apple or not. Start charging those SOBs too.
    Ephesians 2 : 8-9



    Charles Barkley: Nobody doesn't like meat.

  13. #13
    Join Date
    Nov 2007
    Location
    Lexington, SC
    Posts
    14,522

    Default

    Quote Originally Posted by Chuck the Duck Slayer View Post
    Time to add some photos of AR lowers with the third hole as well as a couple Glocks with a fun switch. That'll get someone's attention; might get a visit from one of the alphabet agencies.
    Kinda like these ?
    [attached photos]
    and an AR-10 with the PEW PEW hole.
    [attached photo]
    Quote Originally Posted by ecu1984 View Post
    Steelin' Ducks is the KRT of suppressors and such.

  14. #14
    Join Date
    Jan 2005
    Location
    FROG LEVEL
    Posts
    23,785

    Default

    I like that
    Gettin old is for pussies! AND MY NEW TRUE people say like Capt. Tom >>>>>>>>>/
    "Wow, often imitated but never duplicated. No one can do it like the master. My hat is off to you DRDUCK!"

  15. #15
    Join Date
    Sep 2001
    Location
    Wateree, South Carolina
    Posts
    48,810

    Default

    Apple wants to check your phone for child abuse images – what could possibly go wrong?

    Arwa Mahdawi

    On the surface Apple’s new features sound both sensible and commendable – but they also open a Pandora’s box of privacy and surveillance issues

    Sat 7 Aug 2021 09.00 EDT

    Apple, which has spent big bucks on ad campaigns boasting about how much it values its users’ privacy, is about to start poking through all your text messages and photos. Don’t worry, the tech company has assured everyone, the prying is for purely benevolent purposes. On Thursday Apple announced a new set of “protection for children” features that will look through US iPhones for images of child abuse. One of these features is a tool called neuralMatch, which will scan photo libraries to see if they contain anything that matches a database of known child abuse imagery. Another feature, which parents can enable or disable, scans iMessage images sent or received by accounts that belong to a minor. It will then notify the parents when a child receives sexually explicit imagery.

    On the surface Apple’s new features sound both sensible and commendable. Technology-facilitated child sexual exploitation is an enormous problem; one that’s spiralling out of control. In 1998 there were more than 3,000 reports of child sex abuse imagery, according to a 2019 paper published in conjunction with the National Center for Missing and Exploited Children. In 2018 there were 18.4m. These reports included more than 45m images and videos that were flagged as child sexual abuse. Technology companies have a duty to curb the terrible abuses their platforms help facilitate. Apple’s new features are an attempt to do just that.

    But while Apple’s attempts to protect children may be valiant, they also open a Pandora’s box of privacy and surveillance issues. Of particular concern to security researchers and privacy activists is the fact that this new feature doesn’t just look at images stored on the cloud; it scans users’ devices without their consent. Essentially that means there’s now a sort of “backdoor” into an individual’s iPhone, one which has the potential to grow wider and wider. The Electronic Frontier Foundation (EFF), an online civil liberties advocacy group, warns that “all it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content … That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” You can imagine, for example, how certain countries might pressure Apple to scan for anti-government messages or LGBTQ content.

    Jillian York, the author of a new book about how surveillance capitalism affects free speech, is also concerned that Apple’s new parental controls mean images shared between two minors could be non-consensually shared with one of their parents. “This strikes me as assumptive of two things,” she told me. “One, that adults can be trusted with these images, and two, that every other culture has the same ideas about what constitutes nudity and sexuality as the US does.”

    Edward Snowden, who knows a thing or two about abuses of surveillance, has also voiced concerns about Apple’s new features. “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this,” Snowden tweeted. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs – without asking.”

    But why would a technology company bother asking the public what it wants? We all know that big tech knows what’s best for us plebs. While mass surveillance may sound scary, I’m sure we can all trust Apple et al. to do the right thing. No need to worry about hackers or Apple contractors accessing and uploading your nudes! No need to worry about Apple employees exploiting the technology to spy on people, in the same way that Uber employees did with their “God View” tool! I’m sure it will all be perfectly fine.

    https://www.theguardian.com/commenti...-apple-privacy

  16. #16
    Join Date
    Sep 2001
    Location
    Wateree, South Carolina
    Posts
    48,810

    Default

    Apple Child Safety photo scanning: what you need to know

    By David Lumb

    Apple announced that it would be enacting a new protocol: automatically scanning iPhones and iPads to check user photos for child sexual assault material (CSAM). The company is doing this to limit the spread of CSAM, but also adding other features ‘to protect children from predators who use communication tools to recruit and exploit them,’ Apple explained in a blog post. For now, the features will only be available in the US.

    Apple will institute a new feature in iOS 15 and iPadOS 15 (both expected to launch in the next couple months) that will automatically scan images on a user’s device to see if they match previously-identified CSAM content, which is identified by unique hashes (e.g. a set of numbers consistent between duplicate images, like a digital fingerprint).
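
    As a toy illustration of that “digital fingerprint” idea: two byte-identical copies of a file produce the same hash, so a service can match known files without inspecting the pictures themselves. This uses plain SHA-256 for simplicity; a perceptual hash of the kind described above would also tolerate resizing and re-encoding, which SHA-256 does not.

    import hashlib

    original  = b"...the very same image bytes..."
    duplicate = b"...the very same image bytes..."

    # Identical bytes always produce identical fingerprints.
    print(hashlib.sha256(original).hexdigest() == hashlib.sha256(duplicate).hexdigest())  # True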

    Checking hashes is a common method for detecting CSAM; website security company Cloudflare instituted it in 2019, and it is also used by the anti-child sex trafficking nonprofit Thorn, the organization co-founded by Ashton Kutcher and Demi Moore.

    In addition, Apple has added two systems parents can optionally enable for children in their family network. The first is on-device analysis in the Messages app that scans incoming and outgoing photos for material that might be sexually explicit; such images will be blurred by default, and an optional setting can inform account-linked parents if the content is viewed.
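
    As a hypothetical sketch of the first of those systems, the “blur it by default, maybe tell a parent” behavior might look roughly like this in Python (assuming the Pillow imaging library). The looks_explicit classifier is a placeholder; Apple has not published how its on-device analysis works.

    from PIL import Image, ImageFilter  # requires the Pillow library

    def looks_explicit(image):
        """Placeholder for the on-device classifier; always says no in this sketch."""
        return False

    def handle_incoming_photo(path, child_account=False, parent_opted_in=False):
        """Blur a flagged photo by default; note whether a parent alert would be queued."""
        img = Image.open(path)
        if child_account and looks_explicit(img):
            if parent_opted_in:
                print("Parent notification queued for", path)
            return img.filter(ImageFilter.GaussianBlur(radius=25))
        return img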

    Apple is also enabling Siri and Search to surface helpful resources if a user asks about reporting CSAM; both will also intervene when users make search queries relating to CSAM, informing the searcher of the material’s harmful potential and pointing toward resources to get help.

    That’s an overview of how, by Apple’s own description, it will integrate software to track CSAM and help protect children from predation by intervening when they receive (and send) potentially inappropriate photos. But the prospect of Apple automatically scanning your material has already raised concerns from tech experts and privacy advocates – we’ll dive into that below.

    If you do not have photos with CSAM on your iPhone or iPad, nothing will change for you.

    If you do not make a Siri inquiry or online search related to CSAM, nothing will change for you.

    If your iPhone or iPad’s account is set up with a family in iCloud and your device is designated as a child in that network, you will see warnings and blurred photos should you receive sexually explicit photos. If your device isn’t linked to a family network as belonging to a child, nothing will change for you.

    Lastly, your device won’t get any of these features if you don’t upgrade to iOS 15, iPadOS 15, or macOS Monterey. (The latter will presumably scan iCloud photos for CSAM, but it’s unclear if the Messages intervention for sexually explicit photos will also happen when macOS Monterey users use the app.)

    These updates are only coming to users in the US, and it’s unclear when (or if) they’ll be expanded elsewhere – but given Apple is positioning these as protective measures, we’d be surprised if they didn’t extend it to users in other countries.

    From a moral perspective, Apple is simply empowering parents to protect their children and perform a societal service by curbing CSAM. As the company stated in its blog post, “this program is ambitious, and protecting children is an important responsibility.”

    Apple has repeatedly championed the privacy features of its devices, and backs that up with measures like maximizing on-device analysis (rather than uploading data to company servers in the cloud) and secure end-to-end encrypted communications, as well as initiatives like App Tracking Transparency that debuted in iOS 14.5.

    But Apple has also been on the receiving end of plenty of lawsuits over the years that have seemingly pushed the company to greater privacy protections – for instance, a consumer rights advocate in the EU sued the tech giant in November 2020 over Apple’s practice of assigning each iPhone an Identifier for Advertisers (IDFA) to track users across apps, as reported by The Guardian.

    This may have nudged Apple to give consumers more control with App Tracking Transparency, or at least aligned with the company’s actions in progress.

    TechRadar couldn’t find a particular lawsuit that would have pressured Apple to institute these changes, but it’s entirely possible that the company is proactively protecting itself by giving younger users more self-protection tools as well as eliminating CSAM on its own iCloud servers and iPhones in general – all of which could conceivably limit Apple’s liability in the future.

    But if you can remove CSAM material, why wouldn’t you?

    Soon after Apple introduced its new initiatives, security experts and privacy advocates spoke up in alarm – not, of course, to defend using CSAM but out of concern for Apple’s methods in detecting it on user devices.

    The CSAM-scanning feature does not seem to be optional – it will almost certainly be included in iOS 15 by default, and once downloaded, inextricable from the operating system.

    From there, it automatically scans a user’s photos on their device before they’re uploaded to an iCloud account. If a certain number of photos match those CSAM hashes during a scan, Apple manually reviews the flagged images and, if it determines they are valid CSAM, the user’s account is shut down and their info is passed along to the National Center for Missing and Exploited Children (NCMEC), which collaborates with law enforcement.

    Apple is being very careful to keep user data encrypted and unreadable by company employees unless it breaches a threshold of similarity with known CSAM. And per Apple, “the threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
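
    To get a feel for why a threshold matters to that accuracy claim, here is a back-of-the-envelope calculation of our own (made-up numbers, not Apple's published math): if each innocent photo had some tiny chance of falsely matching a hash, the chance that dozens of them match at once collapses toward zero.

    from math import comb

    def false_flag_upper_bound(n_photos, p_false_match, threshold):
        """Union bound on the chance that at least `threshold` of `n_photos`
        innocent photos all falsely match the hash list (independence assumed)."""
        return comb(n_photos, threshold) * p_false_match ** threshold

    # Made-up numbers: 10,000 photos, a one-in-a-million false match per photo, threshold of 30.
    print(false_flag_upper_bound(10_000, 1e-6, 30))   # prints a vanishingly small number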

    But it’s the automatic scanning that has privacy advocates up in arms. “A backdoor is a backdoor,” digital privacy nonprofit Electronic Frontier Foundation (EFF) wrote in its blog post responding to Apple’s initiative, reasoning that even adding this auto-scanning tech was opening the door to potentially broader abuses of access:

    “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts.

    "That’s not a slippery slope; that’s a fully-built system just waiting for external pressure to make the slightest change,” the EFF wrote, pointing to laws passed in other countries that require platforms to scan user content, like India’s recent 2021 rules.

    Others in the tech industry have likewise pushed back against Apple’s auto-scanning initiative, including Will Cathcart, head of the Facebook-owned WhatsApp messaging service.

    In a Twitter thread, he pointed to WhatsApp’s practice of making it easier for users to flag CSAM, which he claimed led the service to report over 400,000 cases to NCMEC last year, “all without breaking encryption.”

    “This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.” (August 6, 2021)

    In fairness, Facebook has been trying to get around Apple's App Tracking Transparency: after being forced to disclose how much user data its mobile app (and WhatsApp's app) access, Facebook has tried prompting users to allow that access while criticizing Apple for App Tracking Transparency's harm to small businesses (and, presumably, Facebook) relying on that advertising income.

    Other tech experts are waiting for Apple to give more information before they fully side with the EFF’s view.

    “The EFF and other privacy advocates' concern around misuse by authoritarian regimes may be scarily on point or an overreaction - Apple needs to provide more implementation details,” Avi Greengart, founder of tech research and analysis firm Techsponential, told TechRadar via Twitter message.

    “However, as a parent, I do like the idea that iMessage will flag underage sexting before sending; anything that even temporarily slows the process down and gives kids a chance to think about consequences is a good thing.”


    https://www.techradar.com/news/apple...u-need-to-know

  17. #17
    Join Date
    Mar 2005
    Location
    Johnston
    Posts
    22,409

    Default

    Lol. Y’all really think they haven’t had access to anything that’s on your phone all along? Droids included. Don’t be naive.
    Quote Originally Posted by Mars Bluff View Post
    Only thing we need to be wearing in this country are ass whippings & condoms. That'll clear up half our issues.

  18. #18
    Join Date
    Nov 2010
    Location
    Summerville, SC
    Posts
    7,297

    Default

    Yup. It's happening everyday now. Yesterday I was shopping for shoes, went to several different stores. My phone was in my pocket but I never used it while in the store or out of the store to do any searches for shoes or shoe ratings, etc.

    An hour or 2 later I opened my FB page and there were 10+ ads for the exact same shoes I was looking at.
    Same thing has happened 3-4 times in the last few months.
    This also happened a few weeks ago when the wife and I were discussing Social Security disability income (very distinctly different from Social Security retirement income), and a few hours later links to SSDI articles started popping up on my FB page.

    Any expectation of privacy is long gone in today's tech world.

  19. #19
    Join Date
    Aug 2013
    Location
    GreenHood
    Posts
    13,833

    Default

    Quote Originally Posted by FEETDOWN View Post
    Lol. Y’all really think they haven’t had access to anything that’s on your phone all along? Droids included. Don’t be naive.
    Somebody is looking at my stuff and having one of these two reactions

    [attached image]
    Houndsmen are born, not made

    Quote Originally Posted by 2thDoc View Post
    I STAND WITH DUCK CUTTER!
    Quote Originally Posted by JABIII View Post
    I knew it wasn't real because no dogbox...

  20. #20
    Join Date
    Dec 2010
    Posts
    5,189

    Default

    Just wait til they start looking hard at the kids out hunting or shooting guns. Or Visor's boys taking a leak down by the pond. It'll all be in the name of "protecting" the children.
