√𝛂𝛋𝛆

  • 71 Posts
  • 355 Comments
Joined 7 months ago
Cake day: July 5th, 2025


  • Most of the major developments that led to the public stuff happened between 2017 and 2021. The transformer was the big one that made scaling a thing. Altman pushed in a stupid direction that caused a lot of the nonsense, like turning the name “Open AI” into an oxymoron.

    There are some aspects of alignment that point at political corruption and planning with nefarious intent that fit in with the present political bullshit too, but that is very complicated to explain in any depth. If you search the token vocabulary, you will find dubious elements present in compound multi-word tokens that disproportionately represent a single political camp, and likewise with religious media and science denialism. Much of that stuff dates from 2019 or before.


  • Scripting and the command line are an option. It may seem a bit daunting at first but can be a lot faster. Doing AI training stuff, I tried brute forcing for a while, but it is just too tedious. A model like Qwen 3 can be instructed to act like a photo-captioning model. It is quite good at optical character recognition in images too.

    You can script it to caption the whole image or any number of elements like color, location, subjects, etc., while also limiting the length of text. Then modify the image metadata with the text. It is also possible to feed it a set of json keys and it will fill in the values for the dictionary.
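A minimal sketch of the JSON-keys idea, assuming the model’s raw reply arrives as a plain string (the key names and sample reply here are invented for illustration, not any particular model’s output format):

```python
import json
import re

def caption_to_dict(model_output: str, keys: list[str]) -> dict:
    """Pull the first JSON object out of a captioning model's reply.

    Models often wrap JSON in prose, so grab the first {...} span,
    then fill in any requested keys the model skipped so downstream
    scripts always see the same dictionary shape.
    """
    match = re.search(r"\{.*\}", model_output, re.DOTALL)
    data = json.loads(match.group(0)) if match else {}
    # Guarantee every requested key exists, even if the model omitted it.
    return {k: data.get(k, "") for k in keys}

keys = ["subject", "colors", "location"]
reply = 'Here you go: {"subject": "a red bicycle", "colors": "red, black"}'
print(caption_to_dict(reply, keys))
# {'subject': 'a red bicycle', 'colors': 'red, black', 'location': ''}
```

From there, writing the values into the image metadata is one more subprocess call to whatever tagging tool you already use.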

    If you go this route, then you start using tools like ripgrep, and it becomes possible to find and manipulate many thousands of images quickly, pulling niche sets in the tens to hundreds with a single scripted command. You can do stuff like grep -rl foo . | xargs sed -i 's/foo/bar/g' to search a bunch of files and change every instance of a word in place. Not at my comp right now to double-check the flags, but that is the general shape of it.
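Spelled out, that search-and-edit pipeline looks like this (a sketch assuming GNU grep and sed; the /tmp paths and caption text are throwaway examples):

```shell
# Self-contained demo: two fake caption files containing the word "foo".
mkdir -p /tmp/caption_demo && cd /tmp/caption_demo
printf 'a photo of foo on a beach\n' > img1.txt
printf 'foo at sunset\n' > img2.txt

# grep -rl lists files under . that contain "foo"; xargs hands that
# file list to sed, and sed -i rewrites each file in place, replacing
# every "foo" with "bar".
grep -rl foo . | xargs sed -i 's/foo/bar/g'

cat img1.txt img2.txt
```

The same shape works with rg -l in place of grep -rl once ripgrep is installed.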

    Something like Qwen requires enthusiast-level hardware, or, if you have to, a free Google Colab instance. If you search for captioning models, there are much smaller ones like the ancient BLIP that are less accurate but fast and run on anything. Anyways, I know this reply is not exactly what you wanted, but it is an option, and one I thought was beyond me until I tried it. Hopefully it is useful, digital neighbor.


  • You need a model compiled for the architecture. I saw some for the RK35xx devices when shopping for hardware. I do not think there is software made to split up or run arbitrary models on an NPU; the models must be configured for the physical hardware topology. The stuff that runs on most devices is very small, and these models either need a ton of custom fine-tuning or they are barely capable of simple tasks.

    On the other hand, segmentation models are small, and that makes layers, object identification, and background-removal stuff work. Looking at your CPU speed and available memory, it is unlikely to make much difference. You are also memory constrained for running models, though you could use DeepSpeed to offload to a disk drive too.





  • √𝛂𝛋𝛆@piefed.world to Memes@sopuli.xyz: “Totally” (23 points, 9 days ago)

    Greek culture lacked any binary distinction.

    I’m no expert on the subject, but from reading Plato’s dialogues lately, the Athenians of the era just before Alexander had no preclusive prejudice about gendered relations. That said, the human demographic in Plato’s dialogues is very much biased towards the upper class of society, and I believe that class has always slanted towards social exception through hierarchy, with a special place for the rogue aberrant who strings the bow of dogma at the edge of the tribe.


  • I wanted to hotrod and play with cars. One silly magazine I read had an article about how automotive paint was a dark art few ventured into. So I started with that in my teens. I did about everything one can do with cars. I worked with heavy equipment for a while just to get a bunch of welding experience, then went back to painting cars but doing bigger jobs and repairs. Eventually I did not like the state I was in, so I started cycling to hotrod me. The cars started fighting back as more and more of me was lost without the need to carry me around. One day, commuting to work, two scrappy SUVs teamed up against me on a bike. I killed them both, but they broke my neck and back. I guess I wish I had not read that article…

    I have never really thought of work as a means to an end per se. I cannot imagine staying motivated by that. I am not the type to go along with bullshit or do what I’m told. Give me a responsibility and I’ll do a far better job than anyone else, and eventually I’ll become the back-office manager the owner trusts completely. I hate managing people. No task is beneath me. I do not care about narcissistic nonsense from myself or anyone else. I do what needs to be done and others are welcome to do the same around me. Those that do not are none of my concern. I am the employee I wished I could find when I ran my own business… or at least I was. Now, I am a shell of my former self, with 8 of my 9 cat lives spent.


  • Clouds. Messing with optical properties of the universe has far greater consequences in physics. Don’t ask what. I do not recall more than simply seeing someone respond to a similar thing and talk about all the implications of altering… I think it was an atheist thing where someone was proving the universe could not exist if optical properties changed “after the flood” like the mythical nonsense in the bible. I just remember thinking, yup, never gonna mess with that shit… So clouds… Can I be Eros?



  • On that level, maybe invert your mindset and look in Maker spaces. Search by hardware, like the ESP32. You will likely get better (different) results if you search for devices that target EE students instead of those that target Makers in general. For instance, it is well known that Texas Instruments will send free samples of most common chips to anyone with a .edu email who requests them. Projects on hardware like a BeagleBone tend to be more advanced than those on more common Maker hardware. While a BeagleBone is like half of a Raspberry Pi in terms of hardware architecture, if a purpose-made device is created without all the extra overhead fluff, it is pretty good. The STM32 H7 stuff tends to have advanced projects at the handheld-gaming level. The Nordic BLE chips are usually more popular with the advanced crowd.

    You might look at the hardware commits for MicroPython or CircuitPython from people adding DACs or other peripherals. Those commits are likely to lead to their project spaces.

    I’ve seen someone doing a drive swap on an old iPod to an SSD, plus a software chain, but I think that was still only doing the Apple compatibility thing.

    OpenWRT is not a bad place to look either. Any small embedded Linux device is likely to run on OpenWRT, so you may find something interesting just by shopping their hardware support and commit history.