• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 20th, 2023





  • “[an] integrated vehicle system that uses, at minimum, the GPS location of the vehicle compared with a database of posted speed limits, to determine the speed limit, and utilizes a brief, one-time visual and audio signal to alert the driver each time they exceed the speed limit by more than 10 miles per hour.”

    Honestly the only part of this that is unreasonable is that it isn’t immediately followed with “the database updates will be maintained and provided in an open, unencrypted format for free for the life of the vehicle, and the tracking data cannot be used for any other purpose”. GPS is a one-way, trilateration-based signal; the receiver only listens, so it doesn’t inherently track or leak anything. I think we would be a lot safer if we could all agree on what speed to go. The check the rule describes is also trivially simple (rough sketch below).
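    For the curious, here’s a minimal Python sketch of the check being mandated. The POSTED_LIMITS_MPH table, GpsFix type, and should_alert function are all made up for illustration, standing in for whatever the vehicle’s navigation stack and limit database actually provide:

    ```python
    # Hypothetical sketch only: POSTED_LIMITS_MPH, GpsFix, and should_alert
    # stand in for whatever the vehicle's nav stack and limit database expose.
    from dataclasses import dataclass

    POSTED_LIMITS_MPH = {"I-90 mile 42": 65, "Main St": 30}  # made-up DB rows
    ALERT_OVER_MPH = 10  # "more than 10 miles per hour" over the posted limit

    @dataclass
    class GpsFix:
        road_segment: str   # real systems key on lat/lon plus map matching
        speed_mph: float

    def should_alert(fix: GpsFix, already_alerted: bool) -> bool:
        """Fire the brief, one-time visual/audio signal for this excursion?"""
        limit = POSTED_LIMITS_MPH.get(fix.road_segment)
        if limit is None:
            return False                      # no data for this road: stay quiet
        return (fix.speed_mph - limit) > ALERT_OVER_MPH and not already_alerted

    # 77 mph in a 65 zone is 12 over the limit, so the one-time alert fires:
    print(should_alert(GpsFix("I-90 mile 42", 77.0), already_alerted=False))  # True
    ```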



  • Businesses “follow the constitution” here. The nuance is that the first amendment (freedom of speech) only restricts consequences imposed by the government, not by private actors. As a private corporation, the people running Harvard have the right to their own speech, in this case a policy denying graduation, without consequence from the government.

    I in no way endorse the speech that Harvard is expressing, but I do have the right to impose my own consequences on them for it (i.e., not supporting them financially, disparaging them in an online forum like Lemmy, etc.). The constitution prevents the US government from punishing Harvard for these actions in the same ways, unless a law has explicitly been broken.




  • It doesn’t need CSAM data for training; it just needs to know what a boob looks like and what a child looks like. I run some SDXL-based models at home, and I’ve observed it can be harder to avoid than you’d think. There are keywords in porn that blur the lines across datasets (“teen”, “petite”, “young”, “small”, etc.). The word “girl” in particular: I’ve found that adding it to basically any porn prompt gives a small chance of inadvertently creating the undesirable. You have to be really careful and use words like “woman”, “adult”, etc. instead to convince your image model not to make things that look like children. If you’ve ever wondered why internet-based porn generators are behind super heavy guardrails, this is why.



  • Hey now, let’s not exaggerate and hyperbolize. There are types of non-ad data in this message. “Hello!” isn’t an ad. Neither are the links for “Pay Rent” or “Request Maintenance”. By pixel count that has to be at least 3% of the message!

    Also, I’m sure there’s a tracking pixel somewhere, probably embedded in the CDN URLs for those images, so that they can know when and where you opened this message, what type of device you’re on, etc. (rough sketch of the server side after this comment). That’s creepy tracking data, not advertising! (yet)

    Kids these days, never happy with anything.
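
    Since the tracking-pixel bit above is doing the real work, here’s a rough, stdlib-only Python sketch of what the server behind one of those image URLs does. Everything in it (the port, the query string, print-as-logging) is invented for illustration; the real thing just does this at CDN scale:

    ```python
    # Hypothetical sketch of the server side of a tracking pixel: serve an
    # "image" and log who requested it, when, and from what device.
    import datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PixelHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The "open" event: timestamp, client IP, device (User-Agent),
            # plus whatever ID was baked into the URL, e.g. /pixel?msg=rent-ad
            print(datetime.datetime.now().isoformat(),
                  self.client_address[0],
                  self.headers.get("User-Agent", "unknown device"),
                  self.path)
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.end_headers()
            # A real tracker would return a valid 1x1 transparent GIF here.

    if __name__ == "__main__":
        HTTPServer(("", 8080), PixelHandler).serve_forever()
    ```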