Privacy


Privacy & Freedom in the Information Age

Privacy in the digital age.

1

Original post from /r/privacy by /u/Adoptedperson123 on 2023-07-07 07:00:57+00:00.


My child “accidentally” viewed porn on my work PC at home, on a different Wi-Fi network. I work for a big company, so it would be a waste of time for them to go through my search history, right? I just want to know what the chance of getting fired is, and whether I should be worried.

2
Reddit Delete from Archive (lemmy.beyondcombustion.net)
submitted 1 year ago by [email protected] to c/[email protected]

Original post from /r/privacy by /u/monkeywork on 2023-07-07 06:19:11+00:00.


I know most of the third-party tools are down or undergoing changes, but are there any scripts I can run locally, using my own login, that can read over my Reddit data backup (which they finally delivered AFTER the API changes) and delete the posts, comments, etc.?
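
A rough sketch of the kind of local script in question, assuming a personal "script" app registered on the account and a comments.csv file with an id column in the data export (the export's exact file and column names may differ):

```python
# pip install praw
import csv
import praw

# Credentials for a personal "script" app created at https://www.reddit.com/prefs/apps
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
    user_agent="personal cleanup script by u/YOUR_USERNAME",
)

# Walk the comment IDs listed in the data export and delete each one.
with open("comments.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        comment = reddit.comment(id=row["id"])
        comment.edit("[removed by author]")  # optional: overwrite the body first
        comment.delete()
        print("deleted comment", row["id"])
```

The same loop works for posts.csv with reddit.submission(id=...) (link posts can be deleted but not edited); expect Reddit's rate limiting to slow a large history down.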

3

Original post from /r/privacy by /u/shekar_h on 2023-07-07 05:01:47+00:00.


Large Language Models (LLMs), like GPT-4 by OpenAI, have a wide range of applications, from interactive public models to privately hosted instances for businesses. Each application brings its own data protection and privacy compliance concerns.

This thread walks through several of the scenarios companies are exploring and the data protection considerations for each. If you are aware of anything beyond what I have listed below, feel free to respond to this thread:

1. Using Public LLMs

Application: Public models, such as ChatGPT, are used in various contexts due to their versatile capabilities.

Example: An individual might use ChatGPT online to ask general questions or gather information on a topic.

Data Protection Consideration: When interacting with public models, any data shared may be exposed to third parties. Employees might inadvertently submit sensitive data, which can significantly impact the brand and the business, and privacy compliance could be at risk if personal or proprietary information is shared. Users must exercise caution to mitigate this risk (a minimal prompt-redaction sketch is included at the end of this post).

2. Hosting Private Instances

Application: Businesses may host private instances of LLMs for internal use, such as managing corporate knowledge.

Example: A company may use a privately hosted LLM to automate responses to frequently asked internal questions about compliance policies and procedures.

Data Protection Consideration: Hosting LLMs privately reduces the risk of external data leaks, but internal access controls, logging, and retention policies are still needed so that sensitive corporate knowledge is only surfaced to the people entitled to see it.

3. Fine-tuning Public Models

Application: Fine-tuning a public model for a specific task, like customer support.

Example: An organization may fine-tune ChatGPT on its product-specific data to provide automated customer support.

Data Protection Consideration: While the risk of data leakage to the outside is relatively low, data might be exposed inadvertently during the model's interaction with internal users. Exposing customer information, salary, or sensitive business data can lead to serious issues. Therefore, businesses must establish strict data management practices and privacy compliance protocols during fine-tuning and deployment.

4. Using Applications that Employ LLMs

Application: Tools or platforms that use LLMs for tasks

Example: An app that uses an LLM to help users write essays or reports.

Data Protection Consideration: The risk of data leakage varies depending on whether the application uses public, private, or fine-tuned LLMs. As a general rule, assuming a high level of risk is advisable. Applications must implement stringent data privacy measures and ensure robust security practices to uphold privacy norms.

In summary, navigating the data protection and privacy compliance concerns that come with the versatility of LLMs is crucial. Whether an organization uses public models, hosts private instances, fine-tunes models, or employs LLM-powered applications, robust data management strategies and strict compliance protocols are essential. That said, managing these complexities can be challenging, and organizations will need clear guidance and tooling to leverage LLMs securely and responsibly.
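
As an illustration of the caution item 1 calls for, here is a minimal sketch that strips obvious personal identifiers from a prompt before it is ever sent to a public model. The regex patterns are illustrative assumptions only, not a complete PII filter; a real deployment would use a dedicated PII-detection tool plus organization-specific rules.

```python
import re

# Illustrative patterns only; real deployments need far broader coverage.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each match with a labelled placeholder so the model never sees the raw value."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

prompt = "Summarise this ticket from jane.doe@example.com, callback +1 555 867 5309."
print(redact(prompt))  # this redacted string is what would go to the public model
```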

4

Original post from /r/privacy by /u/saturatedlightning on 2023-07-07 04:55:55+00:00.


Today, after being logged in on my computer for a long while, I wasn't able to get back onto my Wi-Fi after logging off the network and later re-entering the password.

I tried over three passwords I had and none of them worked, even though the router name and Wi-Fi network were the same. How likely is it that my router got hacked?

5

Original post from /r/privacy by /u/geekgeek2019 on 2023-07-07 02:49:37+00:00.


Hello, I wanted to ask how bad it is to use Instagram just for viewing content like memes and reels. I have a few friends from high school on there and hundreds of random meme or career-related content accounts that I follow. I don't post pictures or anything.

I also use TikTok, but mainly for product reviews.

Is this bad? I'm considering deleting the apps but can't quite bring myself to do it.

I was also thinking of deleting these apps and creating dummy accounts with fake data so it's not as bad.

What do you all say? Thanks

6

Original post from /r/privacy by /u/No_Inspector_2784 on 2023-07-07 02:25:46+00:00.


Hi all,

This may be a newbie question, but a lot of the Threads chatter has got me thinking about my permissions.

If you look at the app store listing, there is obviously a lot of concerning data that Threads (along with many other apps) collects. However, under my permissions I only allow Threads access to 'Notifications'; other things like 'Camera' and 'Microphone' are all 'Not Allowed'. Is Threads still collecting all that concerning data listed in the Play Store under Data Safety, or have I limited it by reviewing and minimising permissions?

TIA.

7

Original post from /r/privacy by /u/wewewawa on 2023-07-07 00:22:08+00:00.

8

Original post from /r/privacy by /u/theejpp on 2023-07-06 23:44:19+00:00.


I want something that is short and professional-ish, but I don't know if doing something like my initials is too risky, or if it wouldn't really matter. Would that work? And if not, what sort of subdomain SHOULD I create?

9

Original post from /r/privacy by /u/geekgeek2019 on 2023-07-06 23:27:15+00:00.


Hello

I hope everyone is well and safe

I was super excited for Threads and even joined it, ugh, and then it HIT ME: it's ZUCK's.

Anyways, after lots of research, I kinda wanna get off the Meta apps. I was away from IG for almost a year and it was amazing, until I needed the app for a class.

I am a uni student, and my uni groups are on FB; deleting that would be the most difficult. I also have some support groups, review groups, and city/country-specific groups that help me navigate the city (I am studying abroad), so that part is getting tough. So, while I do uninstall IG, how do I go about FB in the long run, when I need it for those groups? I do plan to keep a dummy IG account if needed (I shop from IG stores), but it will have no friends or connections to my contacts.

lmk what yall do instead!

10

Original post from /r/privacy by /u/Bread-bastard2492 on 2023-07-06 23:25:42+00:00.


More specifically, what is the chain of operations? Do they start with a simple Google search and go from there or is there some database with tons of names and information they can access from some data broker?

11
android gboard (lemmy.beyondcombustion.net)
submitted 1 year ago by [email protected] to c/[email protected]

Original post from /r/privacy by /u/JUSTWANTTOKNO2022 on 2023-07-06 23:07:55+00:00.


Can I use Gboard whilst blocking its connections with NetGuard, or would Google still get some type of data even when it's blocked?

I keep looking for OpenBoard, but it looks like it got deleted or something.

12

Original post from /r/privacy by /u/TyrannicalDuncery on 2023-07-06 22:12:04+00:00.


Stop me at any point if I say something dumb; I could be missing something.

  • I have an easy-to-guess personal email address.
  • Until recently, there was a spot on the internet (on an official institutional website) that showed my name, some context about me, and my email address all together.
  • Occasionally I have used that as a quick way to prove to a stranger that I am probably me in an email.
  • (I think they would have trusted me regardless--FOOLS! mwahaha--so not sure how helpful that really was.)
  • Recently, that page disappeared from the (normie) internet; the page is down, it's not on archive.org, and it's not googlable for me (can't get the cached version). Maybe because it contained my personal info! :)

So now I have an interesting situation: should I post my email and name together on another official institutional website? What would the risks and benefits be?

I'm leaning towards no; I'm thinking that any risks or benefits would be very small, but there might be some cybersecurity and anti-spam benefit to keeping my email hard-to-find.

13

Original post from /r/privacy by /u/borednerdd on 2023-07-06 21:53:48+00:00.


ZUCK: yea so if you ever need info about anyone at harvard

ZUCK: just ask

ZUCK: i have over 4000 emails, pictures, addresses, sns

FRIEND: what!? how’d you manage that one?

ZUCK: people just submitted it

ZUCK: i don’t know why

ZUCK: they “trust me”

ZUCK: dumb fucks

14

Original post from /r/privacy by /u/BackgroundMaximum373 on 2023-07-06 20:46:24+00:00.


I wish this was a joke but it's a serious question...

I might be paranoid, but I have seen an analysis someone did that compared different online users' speech patterns and used AI to try to predict who the alts are...

It made me think: those kinds of patterns are exactly what AI will look for. Knowing how much money is being poured into online AI research (especially privately, and by governments), I wouldn't be surprised at what kind of data they already have on us...

Thoughts and suggestions?

15

Original post from /r/privacy by /u/interwebzdotnet on 2023-07-06 18:42:52+00:00.

16

Original post from /r/privacy by /u/LoadPlus6558 on 2023-07-06 18:33:19+00:00.


Which one is better, or is there maybe an even better browser?

17

Original post from /r/privacy by /u/eshields99 on 2023-07-06 16:50:31+00:00.


I'm looking for the most secure cloud storage solution for personal use. I'm not ready to set up a NAS yet.

So far Filen, Icedrive, and pCloud look interesting...

Any advice?
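
Whichever provider ends up being chosen, one common belt-and-braces approach is to encrypt files locally before they ever reach the cloud, so the provider's zero-knowledge claims don't have to be taken on trust. A minimal sketch with the Python cryptography package (file names are illustrative):

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate the key once and keep it safe (password manager, offline backup);
# if the key is lost, the files are unrecoverable.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt before uploading; only the .enc file goes to the cloud provider.
with open("tax-return-2023.pdf", "rb") as f:       # illustrative file name
    ciphertext = fernet.encrypt(f.read())
with open("tax-return-2023.pdf.enc", "wb") as f:
    f.write(ciphertext)

# After downloading the .enc file back, decrypt it locally with the same key.
with open("tax-return-2023.pdf.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```

Tools like Cryptomator or rclone's crypt backend apply the same idea to whole folders; either way the provider only ever stores ciphertext.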

18

Original post from /r/privacy by /u/vjeuss on 2023-07-06 16:33:32+00:00.


It appears the UK is now a member of some association of countries on data protection. This: https://www.gov.uk/government/news/uk-gets-new-status-in-global-data-privacy-certification-programme

I'd never heard of this, and the list of countries is on the terrifying side.

They even say

UK positioned to help shape practical solutions in building a global data transfers system

...focus on "practical", I guess.

19
ChatGPT conversations (lemmy.beyondcombustion.net)
submitted 1 year ago by [email protected] to c/[email protected]

Original post from /r/privacy by /u/kimchipappi on 2023-07-06 14:09:08+00:00.


Hi, I'm 16 and I have been using ChatGPT for around 4 months now.

I recently exported my data and I now feel very uncomfortable because I’ve said some stuff that I shouldn’t have said and definitely do NOT want stored on OpenAI’s database.

Simply put, I want my conversations to be deleted from OpenAI’s database.

Is this possible?

I saw that “all your data, including profile, conversations, and API usage, will be removed” if I delete my account, but will this only delete it on my end, or will it ACTUALLY delete it completely from OpenAI’s server/database so that no one can access it or even know it existed?

and please don’t say anything like “it can’t be that bad” or “you’ll be fine” because I am genuinely curious.

I’d really appreciate an honest answer.

Thanks in advance!

20

Original post from /r/privacy by /u/Der-Ubermensch on 2023-07-06 13:54:09+00:00.


From risqué photos to voice recordings, from family tree apps to personal journals, there are a multitude of smartphone apps that are incredibly handy yet incredibly private.

Smartphones are mobile by nature, making this data vulnerable to both physical theft and accidental loss. For those with particular threat models, they are also vulnerable to remote hacking, as with the Israeli spyware firm NSO Group and its iMessage exploit. Phones are always with us, and they provide little benefit if never connected to a network. Whilst some may point to the ability to brick stolen phones (provided they haven't been turned off or stored in a Faraday cage before the thief attempts data recovery), the number of thefts of unlocked phones by thieves on bikes, snatched as the victim walks down the street, is ever increasing.

As a result, I think it worth discussing the following question:

Should we reserve the storage of our most private/sensitive data for computers kept at home? The theft of desktop computers is rarer, accidental loss is virtually nil, and they still provide value even if not constantly pinging cell towers/satellites (if you choose to use them in this way). Should we view our phones as satellites of our computers, providing key functions and able to capture sensitive data that is later transferred to the computer for long-term secure storage?

Or

Do smartphones offer such convenience and consolidation of devices that these things outweigh any potential risks?

Let’s discuss! Thanks.

21

Original post from /r/privacy by /u/mkbt on 2023-07-06 13:33:35+00:00.

22

Original post from /r/privacy by /u/MolinaGames on 2023-07-06 10:40:59+00:00.


I have been informing myself about privacy lately, and the truth is it seems like quite an interesting topic to me. The problem is that I NEED to use Instagram, Google Maps, TikTok, YouTube, Spotify, and Windows. What is the use of starting to use a private email provider or Firefox, or replacing some Google apps, if in the end I am going to have to use other tools that do not respect my privacy?

23

Original post from /r/privacy by /u/4D4M-ADAM on 2023-07-06 10:20:38+00:00.

24

Original post from /r/privacy by /u/AvnarJakob on 2023-07-06 09:39:58+00:00.


I see a lot of people here arguing that everyone should self-host. But that isn't possible for most people. Either they aren't willing to make the effort of setting up their own home lab, or they just can't afford it. And even if they were able to do it, most don't care enough about their privacy.

Technology like blockchain won't help either, because of the same ignorance: the lack of understanding of, and therefore trust in, these systems.

The problem is the profit motive, which forces companies to be at their most ruthless if they want to survive against their competition.

The only solution is government action. That won't happen under the current system, because billionaires own most governments.

I haven't seen any other solution that would attack the root of the problem other than the end of capitalism.

25

Original post from /r/privacy by /u/cranberry_knight on 2023-07-06 08:16:58+00:00.
