Saw this today and now I'm reconsidering whether Boost is right for me. I'm really hoping this is shitty boilerplate that was accidentally copied and overlooked, because it's some bullshit to say "unless we decide we want to use your personal data for whatever we want".
I know "legitimate interest" is a phrase from the cookies law but there is no legitimate interest justification for this. My data is my data and I decide who has a legitimate interest in it so advertisers can fuck off, as can Boost if this the direction it's going.
Edit to say this blew up. I didn't realise I was kicking such a big hornet's nest and haven't read all the comments yet.
To be clear, what I don't like about this and other provisions in the terms is the language and implications around data use. I've no problem with ads being shown - I want developers to get paid for the work they do, and ads make it possible for users to have "free" access to software if they can't afford to purchase it.
I also want to add the response from Boost's dev below to make sure it's visible. You'll see that it is boilerplate but required by Google, and that it was present in Boost for Reddit too. I just hadn't seen it because I purchased the app immediately based on a recommendation. It doesn't make me happy about it, but it does remove some doubts I was having about the direction Boost is heading.
I will be purchasing the app to support the dev because I do like Boost. I understand not everyone can afford everything, though, so you'll see some other suggestions in the comments below that don't have any ads, in case you're not happy with the free version and the loss of data privacy that comes with its ads.
Dev here.
The dialog and its content are not created by me; it's a standard solution from Google to comply with GDPR and other laws. More info here: https://support.google.com/admob/answer/10114014?hl=en
The consent dialog is also required by Google AdMob to show ads, and it is shown when the ad network is initialized.
When the app launches, it first checks for the remove-ads purchase, and only if that purchase is not present does it initialize the ads SDK. If the remove-ads purchase is detected, the ad network is never initialized.
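For reference, the launch flow looks roughly like this. This is only a minimal sketch assuming the Google UMP SDK and the Google Mobile Ads SDK on Android; hasRemoveAdsPurchase() is a hypothetical stand-in for the real Play Billing check, not Boost's actual code.

```kotlin
import android.app.Activity
import android.os.Bundle
import com.google.android.gms.ads.MobileAds
import com.google.android.ump.ConsentRequestParameters
import com.google.android.ump.UserMessagingPlatform

class MainActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        if (hasRemoveAdsPurchase()) {
            // Remove-ads purchase detected: the ads SDK is never initialized,
            // so the consent dialog is never shown.
            return
        }

        val consentInfo = UserMessagingPlatform.getConsentInformation(this)
        val params = ConsentRequestParameters.Builder().build()

        consentInfo.requestConsentInfoUpdate(
            this,
            params,
            {
                // Google's standard consent form is loaded and shown here
                // when required (e.g. for users covered by GDPR).
                UserMessagingPlatform.loadAndShowConsentFormIfRequired(this) { _ ->
                    // canRequestAds() is available in recent UMP versions.
                    if (consentInfo.canRequestAds()) {
                        MobileAds.initialize(this)
                    }
                }
            },
            { /* consent info update failed; ads stay uninitialized */ }
        )
    }

    // Hypothetical placeholder: a real app would query Play Billing here.
    private fun hasRemoveAdsPurchase(): Boolean = false
}
```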
Boost for Reddit was using the very same ad networks and consent dialog.
That is not correct, or at least it's incomplete. You make it sound like only the company's interest matters, but a legitimate interest assessment always has to take into account the interests of the data subject as well, and if the two are at odds, you need to make a judgment on how to balance them.
Storing IP addresses, for example, falls into this - the company has a legitimate interest in keeping its IT systems protected, and storing IP addresses is necessary to do that effectively. That interest weighs pretty heavily, and since the expected effect on the subject is minimal and there's no less invasive way to achieve the same result, it's okay to do it without giving the user a way to opt out.
Error tracking is already a little more tricky - you need a good argument for why you actually need the personal data to find and fix issues effectively, because most of the time there are ways to do that just as effectively without processing personal data (beyond the IP address being used when sending error reports, of course).
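As a hypothetical illustration (not any particular crash-reporting SDK), an error report can be shaped so it never contains personal data in the first place:

```kotlin
// Hypothetical sketch: a crash report that carries only what is needed to
// reproduce and fix a bug. Fields like user id, email or IP address are
// deliberately absent, since they are rarely needed for debugging.
data class CrashReport(
    val exceptionType: String,
    val stackTrace: String,
    val appVersion: String,
    val osVersion: String,
)

fun buildReport(t: Throwable, appVersion: String, osVersion: String) = CrashReport(
    exceptionType = t::class.java.name,
    stackTrace = t.stackTraceToString(),
    appVersion = appVersion,
    osVersion = osVersion,
)
```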
Of course this is all just theory, and in practice companies will often try to get away with far more liberal interpretations of what constitutes legitimate interest. My point is that legitimate interest as a concept is not the problem, and is actually necessary for the whole thing to work. The problem is companies bending the law and not being properly regulated.
And how often do you suppose that the judgement is made in favor of the data subject? To protect privacy?
Because I am going to guess never, and suggest it is naive to believe that it would ever occur.
That's not how it works. The judgment isn't a preference, it's a decision made based on interpretation of the legislation.
If you decide to judge in favour of storing data and a legal body finds you in breach, you'll be fined and forced to change your interpretation.
That's why it's such a minefield. Yes it can be abused, but yes you might get legally devastated for doing so.
The idea that companies can flippantly choose their preferred interpretation is paranoia, not the reality of how GDPR works.