Maximus Powers

maximuspowers

AI & ML interests

Ethical AI research, synthetic data, web scraping

Recent Activity

liked a Space 19 days ago: ethical-spectacle/gusnet-v1-demo
updated a Space 19 days ago: ethical-spectacle/README
updated a Space 19 days ago: ethical-spectacle/gusnet-v1-demo

Organizations

News Media Biases, Ethical Spectacle, Hugging Face Discord Community

maximuspowers's activity

reacted to Taylor658's post with πŸ‘πŸ‘€ 5 months ago
πŸ’‘Andrew Ng recently gave a strong defense of Open Source AI models and the need to slow down legislative efforts in the US and the EU to restrict innovation in Open Source AI at Stanford GSB.

πŸŽ₯See video below
https://youtu.be/yzUdmwlh1sQ?si=bZc690p8iubolXm_
Β·
reacted to merve's post with πŸ‘ 5 months ago
NVIDIA just dropped NVEagle πŸ¦…

Super impressive vision language model that comes in 7B, 13B and 13B fine-tuned on chat πŸ’¬
Model repositories: merve/nveagle-66d0705108582d73bb235c26
Try it: NVEagle/Eagle-X5-13B-Chat πŸ’¬ (works very well! 🀯)

This model essentially explores having different experts (MoE) for the image encoder part of a vision language model.
How? 🧐
The authors concatenate the vision encoder output tokens together, and apply "pre-alignment": essentially, they fine-tune the experts with a frozen text encoder.

Then they freeze both experts and the decoder and just train the projection layer, and finally, they unfreeze everything for supervised fine-tuning ✨

In the paper, they explore different fusion strategies and vision encoders, extending the basic CLIP encoder, and find that simply concatenating visual tokens works well.
The rest of the architecture is quite similar to LLaVA's.
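
The recipe in the post is compact enough to sketch: run several frozen vision encoders ("experts") on the image, project their output tokens, concatenate them, and hand the result to the decoder. Below is a toy PyTorch version of that fusion step; the encoder dimensions and token counts are made up, the concatenation axis is a simplification, and this is not the authors' code.

```python
import torch
import torch.nn as nn

class ConcatVisionFusion(nn.Module):
    """Toy sketch of the 'concatenate visual tokens' fusion described above.

    Each expert is a frozen vision encoder; its token sequence is projected
    into the LLM embedding width and all sequences are concatenated before
    being fed to the language model. Dimensions are illustrative only.
    """

    def __init__(self, expert_dims, llm_hidden=4096):
        super().__init__()
        # One linear adapter per expert projects its tokens into the LLM embedding space.
        self.adapters = nn.ModuleList(nn.Linear(d, llm_hidden) for d in expert_dims)

    def forward(self, expert_tokens):
        # expert_tokens: list of tensors, each of shape (batch, num_tokens_i, dim_i)
        projected = [adapter(t) for adapter, t in zip(self.adapters, expert_tokens)]
        # Concatenate along the token axis -> (batch, sum of num_tokens_i, llm_hidden)
        return torch.cat(projected, dim=1)

# Two hypothetical experts producing 576 tokens of width 1024 and 256 tokens of width 768:
fusion = ConcatVisionFusion(expert_dims=[1024, 768])
visual_tokens = fusion([torch.randn(1, 576, 1024), torch.randn(1, 256, 768)])
print(visual_tokens.shape)  # torch.Size([1, 832, 4096])
```
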
reacted to TuringsSolutions's post with 🧠 5 months ago
The word 'Lead' has three definitions. When an LLM tokenizes this word, it is always the same token. Imagine being able to put any particular embedding at any particular time into a 'Quantum State'. When an embedding is in a Quantum State, the word token could have up to 3 different meanings (x1, x2, x3). The Quantum State gets collapsed based on the individual context surrounding the word. 'Jill lead Joy to the store' would collapse to x1. 'Jill and Joy stumbled upon a pile of lead' would collapse to x3. Very simple, right? This method produces OFF THE CHARTS results:


https://www.youtube.com/watch?v=tuQI6A-EOqE
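
For contrast, standard contextual encoders already produce a different vector for each occurrence of "lead" depending on its sentence, which is the behaviour the post describes in quantum-state terms. A quick baseline check with plain BERT on the two example sentences (this is not the poster's method, just the standard contextual-embedding comparison):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def lead_embedding(sentence):
    # Return the contextual embedding of the token "lead" in the given sentence.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    idx = inputs.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids("lead"))
    return hidden[idx]

a = lead_embedding("Jill lead Joy to the store")
b = lead_embedding("Jill and Joy stumbled upon a pile of lead")
# Same input token, different contexts -> noticeably different vectors.
print(torch.cosine_similarity(a, b, dim=0).item())
```
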
posted an update 5 months ago
Here's my favorite piece of the summer bias detection research project (paper coming in Sept). We trained BERT for token classification (multi-label) to identify:
- Generalizations
- Unfairness
- Stereotypes

HF Space: maximuspowers/bias-detection-ner
Article on Training: https://huggingface.co/blog/maximuspowers/bias-entity-recognition

Pls reach out with ideas!! Lots more info coming soon; our research group has workshops and a hackathon planned for launching this open source project. Thanks!
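
Assuming the fine-tuned checkpoint is published under the same ID as the Space (an assumption; the training article above has the definitive pointer), inference is a short transformers script. Because the head is multi-label, a sigmoid per label replaces the usual softmax, so one token can carry more than one tag:

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Assumed model ID (mirrors the Space name); check the training article for the real checkpoint.
MODEL_ID = "maximuspowers/bias-detection-ner"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)

text = "Everyone from that town is lazy and can't be trusted."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0]  # (num_tokens, num_labels)

# Multi-label: sigmoid per label; 0.5 is an arbitrary threshold for this sketch.
probs = torch.sigmoid(logits)
tokens = tokenizer.convert_ids_to_tokens(inputs.input_ids[0])
for token, p in zip(tokens, probs):
    hits = [model.config.id2label[i] for i, v in enumerate(p) if v > 0.5]
    if hits:
        print(token, hits)  # e.g. generalization / unfairness / stereotype tags
```
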
reacted to davanstrien's post with 🧠 6 months ago
πŸš€ Introducing Hugging Face Similar: a Chrome extension to find relevant datasets!

✨ Adds a "Similar Datasets" section to Hugging Face dataset pages
πŸ” Recommendations based on dataset READMEs
πŸ—οΈ Powered by https://huggingface.co/chromadb and https://huggingface.co/Snowflake embeddings.

You can try it here: https://chromewebstore.google.com/detail/hugging-face-similar/aijelnjllajooinkcpkpbhckbghghpnl?authuser=0&hl=en.

I am very happy to get feedback on whether this could be useful or not πŸ€—
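
Under the hood this kind of feature is an embed-and-retrieve loop: embed every dataset README, store the vectors, and query the nearest neighbours for the page you're viewing. A rough sketch with chromadb and a Snowflake arctic-embed model, using toy data; the ingredients are my guess from the post, not the extension's actual code:

```python
import chromadb
from sentence_transformers import SentenceTransformer

# Embedding model from the Snowflake org mentioned above (the size variant is my choice).
embedder = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")

client = chromadb.Client()
collection = client.create_collection("dataset-readmes")

# Toy corpus: the real extension would index README text for many HF datasets.
readmes = {
    "user/imdb-reviews": "Movie reviews labeled for sentiment classification...",
    "user/news-topics": "News articles annotated with topic categories...",
    "user/toxic-comments": "Comments labeled for toxicity and hate speech...",
}
collection.add(
    ids=list(readmes.keys()),
    documents=list(readmes.values()),
    embeddings=embedder.encode(list(readmes.values())).tolist(),
)

# "Similar datasets" for the page currently being viewed:
query = readmes["user/toxic-comments"]
results = collection.query(
    query_embeddings=embedder.encode([query]).tolist(),
    n_results=3,  # the top hit is the page itself; the rest are the recommendations
)
print(results["ids"][0])
```
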
Β·
reacted to alvarobartt's post with πŸ”₯ 6 months ago
πŸ€— Serving Meta Llama 3.1 405B on Google Cloud is now possible via the Hugging Face Deep Learning Containers (DLCs) for Text Generation Inference (TGI)

In this post, we showcase how to deploy https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct-FP8 on an A3 instance with 8 x H100 GPUs on Vertex AI

Thanks to the Hugging Face DLCs for TGI and Google Cloud Vertex AI, deploying a high-performance text generation container for serving Large Language Models (LLMs) has never been easier. And we’re not going to stop here – stay tuned as we enable more experiences to build AI with open models on Google Cloud!

Read the full post at https://huggingface.co/blog/llama31-on-vertex-ai
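
The linked post has the full walkthrough; in terms of the Vertex AI Python SDK, the core steps look roughly like the sketch below. The project, region, container URI, and environment variables are placeholders to take from the post and your own GCP setup, not values I can vouch for.

```python
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")  # placeholders

# Hugging Face DLC for TGI on Google Cloud; take the exact image URI from the blog post.
TGI_DLC_URI = "us-docker.pkg.dev/<registry>/huggingface-text-generation-inference:<tag>"

model = aiplatform.Model.upload(
    display_name="llama-3-1-405b-instruct-fp8",
    serving_container_image_uri=TGI_DLC_URI,
    serving_container_ports=[8080],  # TGI's default serving port
    serving_container_environment_variables={
        "MODEL_ID": "meta-llama/Meta-Llama-3.1-405B-Instruct-FP8",
        "NUM_SHARD": "8",                        # shard across the 8 GPUs
        "HUGGING_FACE_HUB_TOKEN": "<hf-token>",  # the model repo is gated
    },
)

endpoint = model.deploy(
    machine_type="a3-highgpu-8g",         # A3 instance
    accelerator_type="NVIDIA_H100_80GB",
    accelerator_count=8,                  # 8 x H100 80GB
)

print(endpoint.predict(instances=[{"inputs": "What is TGI?"}]))
```
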
reacted to victor's post with ❀️ 6 months ago
How good are you at spotting AI-generated images?

Find out by playing Fake Insects 🐞, a game where you need to identify which insects are fake (AI generated). Good luck & share your best score in the comments!

victor/fake-insects
Β·
replied to victor's post 6 months ago

I was waiting for it to say "Jk they're all fake"