  • I’ve set up Okular signing and it worked, but I believe it was with an S/MIME certificate tied to my email (not PGP keys). If you want, I can try to figure out exactly what I did to make it work.

    Briefly, off the top of my head, I believe it was:

    1. Getting an S/MIME certificate for my email from an authority that provides them. There’s one Italian company that will do this for any email address for free.
    2. Converting the S/MIME certificate to another format (there’s a rough sketch of steps 2-3 at the end of this comment).
    3. Importing the certificate into Thunderbird’s (or maybe it was Firefox’s) certificate store (and, as a side quest, setting up Thunderbird to sign email with that certificate).
    4. Telling Okular to use the Thunderbird/Firefox certificate store as the place to find certificates.

    I can’t remember if there was a way to do this easily with PGP keys.
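
    In case it’s useful, here’s a rough sketch of what steps 2-3 might look like, assuming the provider gives you a PEM certificate plus private key (every filename and the Thunderbird profile path below are just placeholders); it shells out to openssl and the NSS pk12util tool:

    import subprocess
    from pathlib import Path

    # Hypothetical placeholder paths; point these at your own files and profile.
    CERT_PEM = Path("my-email-cert.pem")   # certificate the provider issued
    KEY_PEM = Path("my-email-key.pem")     # matching private key
    P12_OUT = Path("my-email-cert.p12")    # PKCS#12 bundle to import
    NSS_DB = Path.home() / ".thunderbird" / "xxxxxxxx.default"  # profile holding the cert store

    def bundle_as_pkcs12() -> None:
        """Step 2: wrap the PEM certificate and key into a single PKCS#12 file."""
        subprocess.run(
            ["openssl", "pkcs12", "-export",
             "-in", str(CERT_PEM), "-inkey", str(KEY_PEM),
             "-out", str(P12_OUT), "-name", "email signing cert"],
            check=True,
        )

    def import_into_cert_store() -> None:
        """Step 3: import the bundle into the profile's NSS certificate database
        (pk12util ships with the NSS tools package on most distros)."""
        subprocess.run(
            ["pk12util", "-i", str(P12_OUT), "-d", f"sql:{NSS_DB}"],
            check=True,
        )

    if __name__ == "__main__":
        bundle_as_pkcs12()
        import_into_cert_store()
        # Step 4 happens in Okular's settings: point its signing backend at the
        # same certificate database directory used above.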





  • I’d be surprised if it was significantly less. A comparable 70-billion-parameter model from Llama requires about 120GB to store. Supposedly the largest current ChatGPT model goes up to 170 billion parameters, which would take a couple hundred GB to store. There are ways to trade off some accuracy to save a lot of space, but you’re not going to get it under tens of GB.
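
    Back-of-envelope math for those sizes (a rough sketch; the bytes-per-parameter figure depends on the precision the weights are stored at):

    def model_size_gb(params_billion: float, bytes_per_param: float) -> float:
        """Raw weight storage in GB (1 GB = 1e9 bytes)."""
        return params_billion * 1e9 * bytes_per_param / 1e9

    print(model_size_gb(70, 2))    # 16-bit weights: ~140 GB for a 70B model
    print(model_size_gb(170, 2))   # ~340 GB for a 170B model
    print(model_size_gb(70, 0.5))  # quantized to 4-bit: ~35 GB, still tens of GB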

    These models really are reading that many GB of parameters once for every word of the output. GPUs and tensor processors are crazy fast. For comparison, think about how much data a GPU pushes for a 4k60 display. It’s like 1GB per second. And the memory bandwidth recommended for a GPU that renders it is more like 400GB per second. Crazy fast.
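
    Rough numbers behind that comparison (assuming an uncompressed 4 bytes per pixel for the display, and that every weight is read once per generated word):

    def framebuffer_gb_per_s(width: int, height: int, fps: int, bytes_per_pixel: int = 4) -> float:
        """Uncompressed pixel data pushed to the display per second, in GB/s."""
        return width * height * bytes_per_pixel * fps / 1e9

    def words_per_second(model_gb: float, bandwidth_gb_per_s: float) -> float:
        """Crude upper bound on output rate if all weights are read once per word."""
        return bandwidth_gb_per_s / model_gb

    print(framebuffer_gb_per_s(3840, 2160, 60))  # ~2 GB/s of raw 4k60 pixels
    print(words_per_second(140, 400))            # ~3 words/s at 400 GB/s of memory bandwidth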