AI & ML interests

sharing knowledge

Recent Activity

HuggingFaceDocBuilder updated a dataset 1 minute ago
hf-doc-build/doc-build-dev
HuggingFaceDocBuilder updated a dataset 17 minutes ago
hf-doc-build/doc-build
HuggingFaceDocBuilder updated a dataset 29 minutes ago
hf-doc-build/doc-build-dev

hf-doc-build's activity

regisss
posted an update about 2 months ago
regisss
posted an update 4 months ago
Interested in performing inference with an ONNX model? ⚡️

The Optimum docs about model inference with ONNX Runtime are now much clearer and simpler!

Want to deploy your favorite model on the Hub but don't know how to export it to the ONNX format? You can do it in one line of code as follows:
from optimum.onnxruntime import ORTModelForSequenceClassification

# Load the model from the hub and export it to the ONNX format
model_id = "distilbert-base-uncased-finetuned-sst-2-english"
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)

Check out the whole guide 👉 https://huggingface.co/docs/optimum/onnxruntime/usage_guides/models
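
Once exported, the ORT model plugs straight into a regular transformers pipeline. A minimal sketch of what inference could look like (the example sentence is illustrative, not from the guide):

from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"

# Export the model to ONNX and load it with ONNX Runtime
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Run inference exactly as with a vanilla transformers pipeline
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("ONNX Runtime inference is fast!"))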
Wauplin
posted an update 4 months ago
What a great milestone to celebrate! The huggingface_hub library is steadily becoming a cornerstone of the Python ML ecosystem when it comes to interacting with the @huggingface Hub, and it wouldn't be there without the hundreds of community contributions and pieces of feedback! Whether you are loading a model, sharing a dataset, running remote inference, or starting jobs on our infra, you are almost certainly using it. And this is only the beginning, so give the project a star if you want to follow it 👉 https://github.com/huggingface/huggingface_hub
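
For anyone who hasn't used it directly yet, this is roughly what everyday usage looks like; a minimal sketch (the repo id and filename are illustrative, not from the post):

from huggingface_hub import hf_hub_download

# Download a single file from a Hub repo (cached locally for later calls)
path = hf_hub_download(
    repo_id="distilbert-base-uncased-finetuned-sst-2-english",  # illustrative repo id
    filename="config.json",
)
print(path)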
Wauplin
posted an update 5 months ago
🚀 Exciting News! 🚀

We've just released huggingface_hub v0.25.0 and it's packed with powerful new features and improvements!

✨ Top Highlights:

• 📁 Upload large folders with ease using huggingface-cli upload-large-folder. Designed for your massive models and datasets. Highly recommended if you struggle to upload your fine-tuned Llama 70B model 🤡 (see the sketch after this list)
• 🔎 Search API: new search filters (gated status, inference status) and fetch the trending score.
• ⚡ InferenceClient: major improvements simplifying chat completions and handling async tasks better.
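
A minimal sketch of the same large-folder upload from Python via HfApi.upload_large_folder (the repo id and local path are placeholders):

from huggingface_hub import HfApi

api = HfApi()

# Chunked, resumable upload of a huge local folder to a Hub repo
api.upload_large_folder(
    repo_id="username/my-llama-70b-finetune",  # placeholder repo id
    repo_type="model",
    folder_path="./my-llama-70b-finetune",     # placeholder local path
)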

We've also introduced tons of bug fixes and quality-of-life improvements - thanks to the awesome contributions from our community! 💪

💡 Check out the release notes: Wauplin/huggingface_hub#8

Want to try it out? Install the release with:

pip install huggingface_hub==0.25.0

Wauplin
posted an update 7 months ago
🚀 Just released version 0.24.0 of the huggingface_hub Python library!

Exciting updates include:
⚡ InferenceClient is now a drop-in replacement for OpenAI's chat completion!

✨ Support for response_format, adapter_id, truncate, and more in InferenceClient

💾 Serialization module with a save_torch_model helper that handles shared layers, sharding, naming conventions, and safe serialization. Basically a condensed version of logic scattered across safetensors, transformers, and accelerate (see the sketch after this list)

📁 Optimized HfFileSystem to avoid getting rate limited when browsing HuggingFaceFW/fineweb

🔨 HfApi & CLI improvements: prevent empty commits, create repos inside a resource group, webhooks API, more options in the Search API, etc.
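
A minimal sketch of the save_torch_model helper mentioned above (the toy model and output directory are illustrative):

import torch
from huggingface_hub import save_torch_model

# Any torch.nn.Module works; this tiny model is only an illustration
model = torch.nn.Sequential(
    torch.nn.Linear(16, 32),
    torch.nn.ReLU(),
    torch.nn.Linear(32, 2),
)

# Saves the weights as (optionally sharded) safetensors files in the target directory
save_torch_model(model, "path/to/output_dir")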

Check out the full release notes for more details: Wauplin/huggingface_hub#7 👀
Wauplin
posted an update 7 months ago
🚀 I'm excited to announce that huggingface_hub's InferenceClient now supports OpenAI's Python client syntax! For developers integrating AI into their codebases, this means you can switch to open-source models with just three lines of code. Here's a quick example of how easy it is:
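
(A minimal sketch; the model id and prompt below are placeholders, not from the original post.)

from huggingface_hub import InferenceClient

# Same call shape as OpenAI's client.chat.completions.create(...)
client = InferenceClient()
response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # placeholder model id
    messages=[{"role": "user", "content": "Explain ONNX in one sentence."}],
    max_tokens=100,
)
print(response.choices[0].message.content)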

Why use the InferenceClient?
🔄 Seamless transition: keep your existing code structure while leveraging LLMs hosted on the Hugging Face Hub.
🤗 Direct integration: easily launch a model to run inference using our Inference Endpoints service.
🚀 Stay updated: always be in sync with the latest Text Generation Inference (TGI) updates.

More details in https://huggingface.co/docs/huggingface_hub/main/en/guides/inference#openai-compatibility
Wauplin
posted an update 9 months ago
🚀 Just released version 0.23.0 of the huggingface_hub Python library!

Exciting updates include:
📁 Seamless download to a local dir (see the sketch after this list)!
💡 Grammar and Tools in InferenceClient!
🌐 Documentation fully translated to Korean!
👥 User API: get likes, upvotes, number of repos, etc.!
🧩 Better model cards and encoding for ModelHubMixin!
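
A minimal sketch of the local-dir download mentioned above (the repo id and target directory are placeholders):

from huggingface_hub import snapshot_download

# Download a full repo snapshot straight into a local directory
local_path = snapshot_download(
    repo_id="distilbert-base-uncased-finetuned-sst-2-english",  # placeholder repo id
    local_dir="./local-model",                                  # placeholder target dir
)
print(local_path)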

Check out the full release notes for more details: Wauplin/huggingface_hub#6 👀
Wauplin
posted an update 11 months ago
🚀 Just released version 0.22.0 of the huggingface_hub Python library!

Exciting updates include:
✨ Chat-completion API in the InferenceClient (see the sketch after this list)!
🤖 Official inference types in InferenceClient!
🧩 Better config and tags in ModelHubMixin!
🏆 Generate model cards for your ModelHubMixin integrations!
🏎️ 3x download speed in HfFileSystem!
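
A minimal sketch of the new chat-completion API (the model id and prompt are placeholders):

from huggingface_hub import InferenceClient

client = InferenceClient()

# Chat completion against a model hosted on the Hub
output = client.chat_completion(
    messages=[{"role": "user", "content": "What is the Hugging Face Hub?"}],
    model="HuggingFaceH4/zephyr-7b-beta",  # placeholder model id
    max_tokens=100,
)
print(output.choices[0].message.content)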

Check out the full release notes for more details: Wauplin/huggingface_hub#5 👀
Wauplin
posted an update 12 months ago
🚀 Just released version 0.21.0 of the huggingface_hub Python library!

Exciting updates include:
🖇️ Dataclasses everywhere for an improved developer experience!
💾 HfFileSystem optimizations!
🧩 PyTorchModelHubMixin now supports configs and safetensors (see the sketch after this list)!
✨ audio-to-audio supported in the InferenceClient!
📚 Translated docs in Simplified Chinese and French!
💔 Breaking changes: simplified API for listing models and datasets!
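
A minimal sketch of the PyTorchModelHubMixin workflow referenced above (the class, directories, and repo id are illustrative):

import torch
from huggingface_hub import PyTorchModelHubMixin

# Any torch.nn.Module can opt into save/load/push helpers by adding the mixin
class MyModel(torch.nn.Module, PyTorchModelHubMixin):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(128, 128)

model = MyModel()

# Save locally, then reload (this release adds safetensors serialization)
model.save_pretrained("./my-model")
reloaded = MyModel.from_pretrained("./my-model")

# Or push straight to the Hub
# model.push_to_hub("username/my-model")  # placeholder repo id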

Check out the full release notes for more details: Wauplin/huggingface_hub#4 🤖💻