Need it for my research project on pollution in Europe, but also for general stuff, so not a science-specific one, just a general-purpose model.

  • k110111@feddit.de · 1 year ago

    Okay, as a broke-ass student in a similar situation, I'll try to help you out before someone offers better advice.

    So you have these options.

    Definitely legal:

    1. Self-host a local LLaMA model on Colab (see the sketch after this list).

    2. Try Hugging Face Spaces; you can find a lot of models hosted there.

    3. cohere.ai, AI21, cerebrium.ai (free trials/freemium).

    4. Try poe.com (really awesome GPT-3.5).

    5. perplexity.ai (good for researching articles).

    6. you.com (it has internet access like Bing, but is generally more descriptive and useful than Bing).

    7. Bing AI (tries too hard not to hallucinate).

    8. Claude 2 on claude.ai (only available in the US and UK, so you might need a VPN for account creation).
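
    For option 1, here is a minimal sketch of what "self-host on Colab" can look like, assuming the free Colab GPU and the Hugging Face transformers library; the TinyLlama model name and the generation settings are just examples, any small open chat model from the Hub works the same way.

    ```python
    # In a Colab cell, first run: !pip install transformers accelerate
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example model, small enough for free Colab
        device_map="auto",                           # use the Colab GPU if one is attached
    )

    prompt = "Summarize the main sources of air pollution in Europe."
    result = generator(prompt, max_new_tokens=200, do_sample=True, temperature=0.7)
    print(result[0]["generated_text"])
    ```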

    Kinda legal (I think):

    1. gptforfree on GitHub (it's reverse-engineered from other services).

    2. ChimeraGPT Discord (gives you a free OpenAI API key; includes GPT-4, image generation, etc.; see the sketch after this list for how such keys are typically used).

    3. PocketGPT (based on ChimeraGPT; this is the closest to what you want: GPT-4 directly in your browser, but sometimes it doesn't work, so you might want to try again later).
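
    A hedged sketch of how those "free OpenAI API key" proxies are typically used: you point the regular OpenAI Python client at the proxy's base URL instead of api.openai.com. The key and base URL below are placeholders, not real values; get the actual ones from the service itself.

    ```python
    # pip install openai
    from openai import OpenAI

    client = OpenAI(
        api_key="sk-proxy-key-from-discord",          # placeholder key
        base_url="https://example-proxy.invalid/v1",  # placeholder, replace with the proxy's URL
    )

    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Give me three angles for a paper on air pollution in Europe."}],
    )
    print(resp.choices[0].message.content)
    ```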

    My piece of advice: use multiple tools, since they all have somewhat different advantages. I mostly used GPT-4 for smaller things and Claude for larger ones (it has a 100k context window!).

    I still have to look up the URL for PocketGPT; it's a pretty small project, so I couldn't find it on Google. Meanwhile, try this and see if it works: https://chat.ylokh.xyz/

    Link for pocketgpt: naturegpt.000webhost.com/pocketgpt

    • SurpriZe@lemm.ee (OP) · 1 year ago

      Thanks a lot! I'll definitely try all of these. Self-hosting looks kinda difficult and less effective compared to the other options, or am I missing something?

      • k110111@feddit.de · 1 year ago

        I kinda got lost in making that list (just Asperger's things) and listed every model I knew.

        For a layperson, yeah, self-hosting isn't as effective yet. But for someone who studies AI (like me), self-hosting is a must. Some use cases are:

        Retrain on your own data (big market potential)

        Make your own bots for specific applications/use cases (like parsing Wikipedia before answering; there's a sketch of this after the list)

        Bypass censorship (funny story: my friend asked Claude to summarize a book on dystopia, and it kept telling her to talk about something else because dystopias were too depressing for Claude)

        I've even heard about models that are specialized for just one task, like chatting or logic puzzles

        And lastly, privacy nerds like me
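
        Here's a rough sketch of the "parse Wikipedia before answering" idea, assuming a locally hosted model via llama-cpp-python; the model file name and the wikipedia package are illustrative choices, swap in whatever local stack you actually use.

        ```python
        # pip install wikipedia llama-cpp-python
        import wikipedia
        from llama_cpp import Llama

        llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf")  # hypothetical local model file

        def answer_with_wikipedia(question: str, topic: str) -> str:
            # Ground the answer in a short Wikipedia summary of the topic.
            context = wikipedia.summary(topic, sentences=5)
            prompt = (
                "Use the following background to answer the question.\n\n"
                f"Background: {context}\n\n"
                f"Question: {question}\nAnswer:"
            )
            out = llm(prompt, max_tokens=300)
            return out["choices"][0]["text"]

        print(answer_with_wikipedia("What are the main air pollutants in Europe?", "Air pollution"))
        ```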

  • Ziggurat@sh.itjust.works · 1 year ago

    Would running locally be an option? You can check out the GPT4All project, which lets you run LLMs locally. It's obviously slower than ChatGPT, but you keep your data on your PC, and it's free as in free speech and as in free beer.
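
    For reference, a minimal sketch using the GPT4All Python bindings; the model name is just one example from the catalogue (it gets downloaded on first use), not a specific recommendation.

    ```python
    # pip install gpt4all
    from gpt4all import GPT4All

    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # example model, downloaded locally on first run

    with model.chat_session():
        reply = model.generate("Explain in one paragraph how PM2.5 pollution is measured.", max_tokens=200)
        print(reply)
    ```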

        • mind@lemmy.world · 1 year ago

          Maybe, it depends on how serious this is.

          A local Llama 2 or OpenAssistant install could keep him off the grid, if this needs to be clandestine.

          Those required logins always end up tied to a phone number, and VoIP numbers are always blocked. So they have a real name: anyone can look up the name behind the account, no special access needed.

          If you don’t want every AI interaction tied to your real name, there’s ways to do it. 👁️

          • eating3645@lemmy.world · 1 year ago

            Maybe, it depends on how serious this is.

            Total agreement. Small-scale LLMs are super cool, they just don't have as high-quality output; if they're good enough for the job, though, they're perfect.

    • subspaceinterferents@lemmy.world · 1 year ago

      I use Claude.ai a lot, and I like it. It can and will gin up a lot of nonsense, on occasion. I’ve asked it some pointed questions on arcane subjects I know details about, and it’s spit out a pack of crap. I always preface my query with a statement asking Claude to admit when it doesn’t have a factual answer, rather than making one up out of whole cloth. Seems to work.
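
      The same "admit when you don't know" preface can also go in a system prompt if you ever use the API instead of the web chat; a hedged sketch with the Anthropic Python SDK is below. The model id and the exact wording are assumptions, and pasting the preface straight into claude.ai works the same way.

      ```python
      # pip install anthropic
      import anthropic

      client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

      message = client.messages.create(
          model="claude-3-haiku-20240307",  # example model id
          max_tokens=500,
          system="If you do not have a factual answer, say so plainly instead of inventing one.",
          messages=[{"role": "user", "content": "What limits did the EU set on NOx emissions in 1998?"}],
      )
      print(message.content[0].text)
      ```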

    • fitz@linkopath.com · 1 year ago

      I'll second this and also add that its ability to read and summarize uploaded documents is really good too.