---
title: Pixel Prompt
emoji: πŸ”₯
colorFrom: red
colorTo: blue
sdk: docker
pinned: false
license: mit
---
# Pixel Prompt JS
This repository contains a static React Native application with a Lambda backend. It serves several diffusion models through the Hugging Face [Inference API](https://huggingface.co/docs/api-inference/index). A blog post explaining this deployment and the Hugging Face Inference API can be found [here](https://medium.com/@HatmanStack/cloud-bound-hugging-face-spaces-1101c569690d).
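Under the hood, each image request is a call to a hosted model on the Inference API. As a minimal sketch (assuming a valid `HF_TOKEN` and one of the diffusion models listed below), an equivalent request with curl looks like this:
```shell
# Text-to-image through the HF Inference API; the response body is the raw image.
curl https://api-inference.huggingface.co/models/stabilityai/stable-diffusion-xl-base-1.0 \
  -X POST \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "a pixel art castle at sunset"}' \
  --output image.png
```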
## Code :zap:
To preview the application, visit the hosted version on Hugging Face Spaces [here](https://hatman-pixel-prompt.hf.space).
## Installation πŸ’»
To generate the static content for this container, you need a working Node/npm installation. Clone the [pixel-prompt-frontend](https://github.com/HatmanStack/pixel-prompt-frontend) repo, then run these commands in its root directory:
```shell
npm install -g yarn
yarn
npx expo export:web
```
Static files are output to the `web-build` folder in the root directory. Move `web-build` from the pixel-prompt-frontend root directory to your pixel-prompt-container root directory, as shown below. **To reach the endpoint from the frontend, use `/api` in pixel-prompt-frontend, NOT `http://localhost:7860/api`.**
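Assuming both repos are cloned into the same parent directory (a hypothetical layout), the move is a single command run from the pixel-prompt-container root:
```shell
# Hypothetical layout: pixel-prompt-frontend and pixel-prompt-container are siblings.
mv ../pixel-prompt-frontend/web-build ./web-build
```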
Add your `HF_TOKEN` as an environment variable in your container settings.
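If you run the container locally instead, the token can be passed on the command line. A minimal sketch, assuming the image is tagged `pixel-prompt` and the app listens on port 7860 as noted above:
```shell
docker build -t pixel-prompt .
docker run -e HF_TOKEN=hf_your_token_here -p 7860:7860 pixel-prompt
```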
## Models ⚑
All models are open source and available on Hugging Face.
### Diffusion
- **Random**
- **fal/AuraFlow**
- **stabilityai/stable-diffusion-3-medium**
- **stabilityai/stable-diffusion-xl-base-1.0**
- **nerijs/pixel-art-xl**
- **Fictiverse/Voxel_XL_Lora**
- **dallinmackay/Van-Gogh-diffusion**
- **gsdf/Counterfeit-V2.5**
### Prompts
- **Gustavosta/MagicPrompt-Stable-Diffusion**
- **meta-llama/Meta-Llama-3-70B-Instruct**
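The prompt models are called through the same Inference API, but as text generation, so the response is JSON rather than image bytes. A minimal sketch with curl:
```shell
# Prompt expansion via text generation; the response is JSON, not image bytes.
curl https://api-inference.huggingface.co/models/Gustavosta/MagicPrompt-Stable-Diffusion \
  -X POST \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "a castle"}'
```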
## Functionality
This app was created using the Hugging Face Inference API. Although free to use, some functionality isn't available yet. The Style and Layout switches are based on IP-Adapter, which isn't supported by the Inference API. If you decide to use custom endpoints, this functionality is available now.
## License
This project is licensed under the [MIT License](LICENSE).
## Acknowledgments πŸ†
This application is built with Expo, a powerful framework for building cross-platform mobile applications. Learn more about Expo: [https://expo.io](https://expo.io)
This application uses the Hugging Face Inference API, provided by [Hugging Face](https://huggingface.co).