Privacy, encryption vs. Surveillance state

The US Department of Homeland Security (DHS) is pushing hard for mandatory facial recognition scans at airports. The government wants to “remove a loophole” that currently allows Americans to opt out.
In a recent filing, the DHS requested a change to the current rules in order to “provide that all travelers, including US citizens, may be required to be photographed upon entry and/or departure” from the US, citing the need to identify criminals or “suspected terrorists.” While not yet implemented, the rule change is in the “final stages of clearance,” a DHS official told CNN Business, according to a report by RT.
China is leading the pack. The US Vice President criticizes their program while US agencies continue laying the groundwork for the same system. smh...

Vice President Mike Pence called out the program in a recent speech, warning that China’s surveillance state is “growing more expansive and intrusive — often with the help of U.S. technology.”

“By 2020, China’s rulers aim to implement an Orwellian system premised on controlling virtually every facet of human life — the so-called social credit score,” Mr. Pence said. “In the words of that program’s official blueprint, it will ‘allow the trustworthy to roam everywhere under heaven, while making it hard for the discredited to take a single step.’”

Facial recognition tools are now widely used in China, bolstered by cameras deployed along streets, on buildings, and in train stations, classrooms and subway lines. With the emergence of next-generation 5G telecommunications technology, the reach of these surveillance networks is only expected to increase.

As part of the stepped-up surveillance, the Chinese government announced this month that all who purchase SIM cards for mobile phones must first produce a facial recognition print.
Beijing euphemistically calls the program part of “social management” — a key element of communist ideology to shape and control society.

In reality, critics say, the system is designed to preserve the power of the Communist Party of China, blacklisting and punishing anyone the system spots engaging in unapproved activities. It marks a high-tech upgrade of traditional measures of control.

In the past, the party relied on a system called “dang’an,” or personal file — millions of dossiers on citizens filled with personal information ranging from comments made in high school to remarks made to coworkers.

The social credit system (SCS) is expected to take the dang’an system to new levels of surveillance through the use of advanced technology.

“At its core, the system is a tool to control individuals’, companies’ and other entities’ behavior to conform with the policies, directions and will of the [Communist Party of China],” said Samantha Hoffman, a China specialist with the Australian Strategic Policy Institute in Canberra. “It combines big-data analytic techniques with pervasive data collection to achieve that purpose.”

More (long):
The mobile phone industry has explored the creation of a global data-sharing system that could track individuals around the world, as part of an effort to curb the spread of Covid-19.

The Guardian has learned that a senior official at GSMA, an international standard-setting body for the mobile phone industry, held discussions with at least one company that is capable of tracking individuals globally through their mobile devices, and discussed the possible creation of a global data-sharing system.

Edward Snowden on Monday warned that high-tech surveillance measures governments use to fight the outbreak of COVID-19, the disease caused by the newly identified coronavirus, could have a long-lasting impact.

Of the funding allocated to the CDC, the bill sets aside at least $500 million for public health data surveillance and modernizing the analytics infrastructure. The CDC must report on the development of a "surveillance and data collection system" within the next 30 days. While it's not clear what form that surveillance system will take, the federal government has reportedly expressed interest in aggregating data that can be gleaned from tech platforms and smartphone use to monitor movement patterns.
Not related to government, just another self-inflicted wound society embraces without any real cost analysis...
Maryland-based Rekor Systems Inc. has started offering home video surveillance software through a service called Watchman, starting at $5 a month. In addition to reading a license plate, the system can record a vehicle’s make, color, and body type. In October, Rekor will launch what it says is a “first of its kind” mobile app, which will let users scan license plates with their phone camera. The app could come in handy for schools, to “securely identify valid visitors for student pickup lines” or to manage cars in parking lots, among other uses, the company says.

The rise of more casual and cheap surveillance is putting tools once primarily used by law enforcement into the hands of virtually anyone. Privacy advocates worry about how personal information will be stored, shared, and used in the absence of clear legal protections.

Concerns about government using mass surveillance to track movements will be deflected the same way phone and internet surveillance was - by outsourcing to private companies.
Privately run car parks in the UK seem to have been using this for a while.
It could just be regular video footage with someone entering the registration numbers into a database, but it's probably plate recognition software?
Sometimes, for the entry barrier to lift, you have to be in the vehicle that you prepaid for.
So today I fired up the Bugsite, clicked onto news and scrolled through the articles available.
I wondered if there was any commentary regarding the Tucker Carlson interview with Bobulinski .....
hmmm no.
Ok, so I select another Hedge article, read it, click on the (Hedge) home page, and at the top, i.e. the most viewed/trending article, is this one -
So I read it, then come back here and refresh the news page, and a new article shows up about blowing up deliveries of missiles to Iran, but still no link to the Carlson–Bobulinski interview. Back to the Hedge and refresh, and yes, there is the Iran article at no. 2.

So the question is -

Who is blocking the Zero Hedge article from sites like this one and other alt sites that link Zero Hedge ?
I'll have to investigate (not convenient from my phone), but my guess is ZH doesn't publish everything in their RSS feed. I'm guessing the page in question was authored by a 3rd party (not one of the Tylers).
Oh... Pmbug is supposed to cache the RSS feed so it's not constantly pinging ZH (et al). I'm pretty sure I set the cache refresh period to an hour.
Thanks Bug
Still curious though, as it was certainly more than an hour. Can the time of first posting (or even first comment) be compared with my observations?
ZH might also have a cache interval affecting how often it updates their RSS feed. For example, if their feed updates every 15 or 30 minutes, you can add that to the Bug cache interval to get a max possible cache delay.
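To illustrate the caching point, here's a minimal sketch (hypothetical Python, not PMBug's actual forum code) of a time-to-live feed cache and the worst-case delay arithmetic described above:

```python
import time

class FeedCache:
    """Minimal time-to-live (TTL) cache for an RSS feed, so the site
    only re-fetches the upstream feed once per interval."""

    def __init__(self, fetch, ttl_seconds=3600):
        self.fetch = fetch        # callable that pulls the live feed
        self.ttl = ttl_seconds    # 3600 = refresh at most once per hour
        self._cached = None
        self._fetched_at = 0.0

    def get(self):
        now = time.time()
        if self._cached is None or now - self._fetched_at >= self.ttl:
            self._cached = self.fetch()   # actually hit the upstream site
            self._fetched_at = now
        return self._cached               # otherwise serve the cached copy

def max_feed_delay(upstream_ttl, local_ttl):
    # Worst case, a new article waits the upstream feed's refresh interval
    # plus the local cache interval before it appears downstream.
    return upstream_ttl + local_ttl
```

So with, say, a 30-minute upstream refresh and a 1-hour local cache, `max_feed_delay(1800, 3600)` gives 5400 seconds: a new article could take up to 90 minutes to show up here.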
Back on topic...

Almost two years ago, I ran across this news story (but failed to post it here) about Palantir Technologies building a supercomputer for the IRS to analyze/track financial transactions:
The Internal Revenue Service (IRS) is building a $99-million supercomputer that will give the agency the “unprecedented ability to track the lives and transactions of tens of millions of American citizens,” tax expert Daniel Pilla reports.

The IRS is already dangerous enough, notes Pilla. “The IRS lays claim to your data without court authority more so than any other government agency. And to make matters worse, they share the data with any other federal, state or local government agency claiming an interest, including foreign governments.”
... the agency is investing $99 million in a contract with Palantir Technologies of Palo Alto, California, to provide hardware, software, and training to “capture, curate, store, search, share, transfer, perform deconfliction, analyze and visualize large amounts of disparate structured and unstructured data.”

Specifically, Palantir is tasked with building and training IRS employees to use a supercomputer to “search, analyze, visualize, and interact with a wide variety of disparate data sets so users will be able to leverage the platform to perform advanced analytics, such as link, pattern, statistical, behavioral, and geospatial analysis on an investigative platform that is scalable and interoperable with existing IRS equipment and systems.”

Some background on Palantir Technologies from that time:
Everything about Palantir is unique. Founded in 2004 by a group of ex-Stanford students including Karp, Joe Lonsdale and PayPal co-founder Peter Thiel, it's the most valuable venture-backed start-up focused on selling to enterprises.

Palantir is notorious for its secrecy, and for good reason. Its software allows customers to make sense of massive amounts of sensitive data to enable fraud detection, data security, rapid health care delivery and catastrophe response.

Government agencies are big buyers of the technology. The FBI, CIA, Department of Defense and IRS have all been customers. Between 30 and 50 percent of Palantir's business is tied to the public sector, according to people familiar with its finances. In-Q-Tel, the CIA's venture arm, was an early investor.

Annual revenue topped $1.5 billion in 2015, sources say, meaning Palantir is bigger than top publicly traded cloud software companies like Workday and ServiceNow. It has about 1,800 employees and is growing headcount 30 percent annually, said the sources, who asked not to be named because the numbers are private.

This morning, I saw this report with a rather chilling quote in it:
Palantir Technologies Inc.’s business is increasing steadily, helped by more government and corporate contracts in part because of the coronavirus pandemic, it said as it reported earnings Thursday for the first time since going public.

The data-analytics company posted a quarterly loss of nearly $900 million that was mostly because of stock-based compensation. The tone of its first earnings call was upbeat and the company raised its full-year revenue outlook to a range of $1.07 billion to $1.072 billion, up 44% year over year.

The pandemic has “created enormous opportunities for us,” said Shyam Sankar, Palantir’s chief operating officer, on the company’s earnings call. The company is helping the government track clinical data and has been tapped to help with vaccine distribution, too. Besides the coronavirus, though, Sankar said he foresees a “large, systemic transformation in health care” that could benefit Palantir.

Sounds like they are building out surveillance tech for contact tracing / medical profiling or similar.
* bump *

NCLA Files Class-Action Against Massachusetts for Auto-Installing Covid Spyware on 1 Million Phones

Nov 15, 2022

Washington, DC (November 15, 2022) – The Massachusetts Department of Public Health (DPH) worked with Google to auto-install spyware on the smartphones of more than one million Commonwealth residents, without their knowledge or consent, in a misguided effort to combat Covid-19. Such brazen disregard for civil liberties violates the United States and Massachusetts Constitutions and cannot stand. The New Civil Liberties Alliance, a nonpartisan, nonprofit civil rights group, has filed a class-action lawsuit, Wright v. Massachusetts Department of Public Health, et al., challenging DPH’s covert installation of a Covid tracing app that tracks and records the movement and personal contacts of Android mobile device users without owners’ permission or awareness.

Plaintiffs Robert Wright and Johnny Kula own and use Android mobile devices and live or work in Massachusetts. Since June 15, 2021, DPH has worked with Google to secretly install the app onto over one million Android mobile devices located in Massachusetts without obtaining any search warrants, in violation of the device owners’ constitutional and common-law rights to privacy and property. Plaintiffs have constitutionally protected liberty interests in not having their whereabouts and contacts surveilled, recorded, and broadcasted, and in preventing unauthorized and unconsented access to their personal smartphones by government agencies.

Once “automatically installed,” DPH’s contact tracing app does not appear alongside other apps on the Android device’s home screen. The app can be found only by opening “settings” and using the “view all apps” feature. Thus, the typical device owner remains unaware of its presence. DPH apparently decided to secretly install the contact tracing app onto over one million Android devices because few Massachusetts citizens were downloading its initial version, which required voluntary adoption. DPH decided to mass-install the app without device owners’ knowledge or consent. When smartphone owners delete the app, DPH simply re-installs it. Plaintiffs’ class-action lawsuit contains nine counts against DPH, including violations of their Fourth and Fifth Amendment rights under the U.S. Constitution, and violations of Articles X and XIV of the Massachusetts Declaration of Rights.

No statutory authority supports DPH’s conduct, which serves no public health purpose, especially since Massachusetts has ended its statewide contact-tracing program. No law or regulation authorizes DPH to secretly install any type of software—let alone what amounts to spyware designed specifically to obtain private location and health information—onto the Android devices of Massachusetts residents. The U.S. District Court for the District of Massachusetts should grant injunctive relief, along with nominal damages, to the class. NCLA is unaware at this time of other states that engaged in a similar surreptitious strategy of auto-installing contact-tracing apps. It appears Massachusetts iPhone users had to consent before a similar app installed on their devices.

NCLA released the following statements:

“Many states and foreign countries have successfully deployed contact tracing apps by obtaining the consent of their citizens before downloading software onto their smartphones. Persuading the public to voluntarily adopt such apps may be difficult, but it is also necessary in a free society. The government may not secretly install surveillance devices on your personal property without a warrant—even for a laudable purpose. For the same reason, it may not install surveillance software on your smartphone without your awareness and permission.”

— Sheng Li, Litigation Counsel, NCLA

“The Massachusetts DPH, like any other government actor, is bound by state and federal constitutional and legal constraints on its conduct. This ‘android attack,’ deliberately designed to override the constitutional and legal rights of citizens to be free from government intrusions upon their privacy without their consent, reads like dystopian science fiction—and must be swiftly invalidated by the court.”

— Peggy Little, Senior Litigation Counsel, NCLA
For more information visit the case page


Global Spyware Scandal: Exposing Pegasus Part One (full documentary) | FRONTLINE




Part one of a two-part docuseries: FRONTLINE and Forbidden Films investigate Pegasus, a powerful spyware sold to governments around the world by the Israeli company NSO Group.


In 2020, the journalism nonprofit Forbidden Stories and Amnesty International gained access to a leaked list of more than 50,000 phone numbers. They suspected it contained numbers selected for potential surveillance with Pegasus. The Pegasus Project reporting consortium — which was led by Forbidden Stories and included 16 other media organizations, FRONTLINE among them — found that the spyware had been used on journalists, human rights activists, the wife and fiancée of the murdered Saudi columnist Jamal Khashoggi, and others.

Over two nights, this docuseries reveals the inside story of an investigation that prompted probes by governments and institutions around the world and sparked calls for an international treaty to govern the largely unregulated spyware industry.

NSO, which has disputed some of the Pegasus Project’s reporting, says that its technology was not associated in any way with Khashoggi’s murder and that it sells Pegasus to vetted governments for “the sole purpose of preventing and investigating terror and serious crime.”

Surveillance technologies like Pegasus are “a military weapon used against civilians, and the civilians, they don’t have any mechanism to help them in seeking justice,” says Laurent Richard, founder of Forbidden Stories and Forbidden Films and one of the producers of the films.

Part two of “Global Spyware Scandal: Exposing Pegasus” premieres Tues., Jan. 10, 2023.

Global Spyware Scandal: Exposing Pegasus Part Two (full documentary) | FRONTLINE

Jan 10, 2023

Part two of a two-part docuseries: FRONTLINE and Forbidden Films investigate Pegasus, a powerful spyware sold to governments around the world by the Israeli company NSO Group.
... By December 2022, The Washington Post's Geoffrey Fowler noted "the Transportation Security Administration has been quietly testing controversial facial recognition technology for passenger screening at 16 major domestic airports."

Theoretically, travelers can opt out in favor of regular ID checks. But anybody who flies much knows how well things often go when you stand on your rights with the TSA—it's a great way to end up in a back room. Just weeks after writing up the rollout, Fowler told PBS: "since my column came out, readers said they followed that, went up to the podium and got pushback" when they objected to the facial scan.
... Just a few years ago, facial recognition often faltered when people masked their faces, such as (to little apparent public-health benefit) during the pandemic. "Even the best of the 89 commercial facial recognition algorithms tested had error rates between 5% and 50% in matching digitally applied face masks with photos of the same person without a mask," the U.S. National Institute of Standards and Technology (NIST) found in 2020.

In new tests just months later, failure rates plunged as algorithms refocused on details of the eyes and nose that aren't covered by face masks. There's little reason to believe that algorithms can't be refined to distinguish people's identity through differences in facial features and skin color.

That said, highly reliable facial recognition only amplifies a lot of other concerns about the surveillance state. Improving Big Brother's competence just sticks us with a more robust Big Brother.
Facial recognition is an increasingly effective technology. But in government hands it's more effective at threatening our privacy and liberty than at offering any real benefit.

The Department of Homeland Security’s Inspector General has released a troubling new report detailing how federal agencies like Immigration and Customs Enforcement (ICE), Homeland Security Investigations (HSI), and the Secret Service have conducted surveillance using cell-site simulators (CSS) without proper authorization and in violation of the law. Specifically, the office of the Inspector General found that these agencies did not adhere to federal privacy policy governing the use of CSS and failed to obtain special orders required before using these types of surveillance devices.

While technically unconnected to each other, recent reports about the Secret Service and Immigration and Customs Enforcement playing fast and loose with rules regarding cellphone tracking and the FBI purchasing phone location data from commercial sources constitute an important wake-up call. They remind us that those handy mobile devices many people tote around are the most cost-effective surveillance system ever invented. We pay the bills for our own tracking beacons, delighted that in addition to tagging our whereabouts, they also let us check into social media and make the occasional voice call.


A "Miss My Face" hat will go a long way towards avoiding the facial recognition bs.

It's a hat with built-in IR LEDs that overwhelm the cameras with bright infrared light. They even have CCD detectors built into them.

Might not be a perfect solution, but will certainly help.
Everybody’s favourite big tech giant, Amazon, is facing yet another class-action lawsuit, this time for allegedly deploying biometric recognition technologies to monitor Amazon Go customers in its New York City outlets without their knowledge. According to the lawsuit, Amazon violated a 2021 NYC law which mandates that all business establishments that track their customers’ biometric information, including retail stores, must at least inform their customers that they are doing so. Amazon apparently didn’t.

Show me the reference to that.
I don't think it directly says exactly that, but it does say that anyone trying to defeat the restrictions and connect to so-called banned sites/services could face up to 20 years in jail and a million-dollar fine.

So let's say they get rid of TikTok or any other site/service the gov deems dangerous, and you get caught using a VPN to surreptitiously connect to it: you would be subject to those penalties.
EPIC and the ACLU being vigilant...

In comments to the National Institute of Standards and Technology (NIST), EPIC and the ACLU urged the standards-setting agency to update their draft guidelines to further reduce collection of biometric information and Social Security Numbers, evaluate the potential of W3C Verifiable Credentials, limit use of potentially harmful fraud prevention tools, and take stronger steps to advance equity. NIST’s Draft Guidelines are non-binding sets of standards and practices that federal agencies consult when designing identity verification systems. The current update in the Guidelines for the first time recommends that agencies provide options instead of biometric identity verification and advises agencies to address equity in system design.

The Electronic Privacy Information Center (EPIC) and the American Civil Liberties Union (ACLU) submit these comments in response to the National Institute of Standards and Technologies’ (NIST) draft Digital Identity Guidelines for Enrollment and Identity Proofing.[1] The updated guidelines provide technical standards for three levels of “identity assurance” to be used across the federal government and for the first time explicitly incorporates equity concerns.
As was laid bare during the COVID-19 pandemic, identity verification systems that fail to properly address equity concerns can create potentially insurmountable barriers to people accessing essential government services. During the height of the pandemic, state workforce agencies rapidly adopted a commercial identity verification system, purportedly NIST SP 800-63 IAL2 compliant, without providing meaningful alternatives, requiring unemployment insurance applicants to upload government documents and snap selfies for facial recognition comparison, or to wait in hours-long online queues for trusted referees when automated processes failed.[3] For the many people who are on the wrong side of the digital divide – disproportionately Black, Latinx, and Indigenous people, those with disabilities, and/or rural households – who lacked access to smartphones with cameras or reliable internet service, or who simply were less familiar with how to use a complex technology, the adoption resulted in an inability to access government benefits when they needed them the most.[4] Moreover, facial recognition technology generally has differential error rates by race and gender, further exacerbating the potential disparate impact of digital identity verification systems that employ it.

We urge NIST to modify the draft guidelines to 1) deprecate repeat, remote collections of biometric information, 2) remove the Social Security Number as a valid attribute for identity verification and invest in alternatives, 3) evaluate W3C Verifiable Credentials as a technical standard to improve remote identity verification, 4) target fraud prevention controls toward large-scale attacks and de-prioritize fraud prevention that creates barriers to claiming benefits, and 5) further strengthen steps to address equity concerns by requiring agencies to provide multiple options for identity verification and other measures.

More (long):


A new U.S. Senate bill would require private messaging services, social media companies, and even cloud providers to report their users to the Drug Enforcement Administration (DEA) if they find out about certain illegal drug sales. This would lead to inaccurate reports and turn messaging services into government informants.

The bill, named the Cooper Davis Act, is likely to result in a host of inaccurate reports and in companies sweeping up innocent conversations, including discussions about past drug use or treatment. While not explicitly required, it may also give internet companies an incentive to conduct dragnet searches of private messages to find protected speech that is merely indicative of illegal behavior.

Most troubling, this bill is a template for legislators to try to force internet companies to report their users to law enforcement for other unfavorable conduct or speech. This bill aims to cut down on the illegal sales of fentanyl, methamphetamine, and counterfeit narcotics. But what would prevent the next bill from targeting marijuana or the sale or purchase of abortion pills, if a new administration deemed those drugs unsafe or illegal for purely political reasons? As we've argued many times before, once the framework exists, it could easily be expanded.

Bad ideas never die in D.C.
Bad ideas never die in D.C.
Which is merely a symptom of the contempt they hold toward our nation's Founding Documents and the principles they contain.

If they held them in high esteem, they'd never even consider legislation like that.
The UK has entered the chat:
... The United Kingdom has been debating the “Online Safety Bill,” which could have serious consequences for many internet companies. This sizable piece of legislation would create many new requirements for platforms that carry user-generated content, including search engines, messaging apps, and social media. Among these requirements are strict age-verification rules and limits on certain types of “legal but harmful” content, which raise significant concerns about the bill’s implications for users’ privacy and speech. While the regulations will mostly be felt by U.K. internet users, the global nature of the internet means they will likely impact users more generally. With that in mind, here are three key reasons Americans should be concerned about what might happen if the United Kingdom passes the Online Safety Bill.


Eventually there will be a code assigned to every financial transaction that describes to an AI what it is. The IRS, FBI and others will receive messages when red flags pop up, or will have a massive database for research into an individual's behavior.

The key question is what will define a red flag. I suppose, the way things are going, if one is on the opposite side of a political agenda, buying an extra soda from a vending machine with a bank card could lead to some agency taking your kids away.
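For what it's worth, card transactions already carry a merchant category code (MCC), so the plumbing for this kind of flagging half-exists. Here's a purely hypothetical Python sketch of what rule-based flagging could look like — the watchlist and rules are invented for illustration, though the MCCs are real codes and the $10,000 figure mirrors the existing Bank Secrecy Act currency-reporting threshold:

```python
# Hypothetical sketch of rule-based transaction flagging. The MCC
# watchlist and rules below are invented for illustration only.
FLAGGED_MCCS = {
    "5993": "cigar stores and stands",      # real MCC, arbitrary choice
    "5122": "drugs and drug proprietaries",
}

def flag_transaction(txn):
    """Return a list of (rule, detail) flags for one transaction dict
    with 'mcc' and 'amount' keys."""
    flags = []
    if txn["mcc"] in FLAGGED_MCCS:
        flags.append(("mcc_watchlist", FLAGGED_MCCS[txn["mcc"]]))
    if txn["amount"] >= 10_000:   # mirrors the existing CTR threshold
        flags.append(("large_transaction", txn["amount"]))
    return flags

# A mundane grocery purchase (MCC 5411) raises no flags:
assert flag_transaction({"mcc": "5411", "amount": 42.50}) == []
```

The worry raised above is exactly that the watchlist dictionary is a policy choice: whoever maintains it decides what counts as suspicious.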