---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: mpnet-multilabel-sector-classifier
  results: []
datasets:
- GIZ/sector_data
co2_eq_emissions: 0.276132
widget:
- text: "Forestry, forestry and wildlife: Vulnerability will be globally high to very high in zones 4 and 5, high to medium in the rest of the country but with strong trends in woodlands (droughts, extreme events);. - Water, sanitation and health: Vulnerability will be globally strong to very strong in zones 4 and 5, strong to medium in the rest of the country but with strong trends in the forested massifs (drought, floods and ground movement)"
  example_title: "Disaster Risk Management (DRM), Water, Environment"
- text: "Change fiscal policies on fossil fuel by 2025 to enable the transition to 100% renewable energy generation in the transportation sector"
  example_title: "Transport, Energy"
- text: "Implementation of the electro-optical channel regulations for the distributed electricians, technicians in other regions and cities. 2- An integrated nationalization that complements the use of smart meter technology inside buildings. 3- Integrated solar photovoltaic in buildings. 4- Support your company and use it from local women s clubs and local producers. Waste. 1- Setting up waste management laws, which encourages the transfer of waste into bottles and bottles, we will burn the waste streams and reduce waste. 1- We use the appropriate regulation in our time to remove electrical and electrical rations from waste. 2- An integrated application for waste management. 3- Investing fire methane on landfill sites. Farming. 1- Nannnai to protect and increase the natural gaunanat"
  example_title: "Social Development, Waste, Urban, Buildings"
---

# mpnet-multilabel-sector-classifier

This model is a fine-tuned version of [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) on the [GIZ/sector_data](https://huggingface.co/datasets/GIZ/sector_data) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2273
- Precision Micro: 0.8075
- Precision Weighted: 0.8110
- Precision Samples: 0.8365
- Recall Micro: 0.8897
- Recall Weighted: 0.8897
- Recall Samples: 0.8922
- F1-score: 0.8464

## Model description

This model performs **multi-label sector classification**: given an input text, it predicts one or more sector labels (see the widget examples above, e.g. Transport, Energy, Waste).
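Because each input can belong to several sectors at once, predictions are read off the per-label sigmoid scores rather than a single softmax. The snippet below is a minimal inference sketch; the hub repo id `GIZ/mpnet-multilabel-sector-classifier` and the 0.5 decision threshold are assumptions, not values confirmed by this card.

```python
from transformers import pipeline

# Minimal multi-label inference sketch.
# Assumed: the hub repo id and the 0.5 decision threshold.
classifier = pipeline(
    "text-classification",
    model="GIZ/mpnet-multilabel-sector-classifier",
    top_k=None,                   # return a score for every sector label
    function_to_apply="sigmoid",  # multi-label: independent per-label probabilities
)

text = (
    "Change fiscal policies on fossil fuel by 2025 to enable the transition "
    "to 100% renewable energy generation in the transportation sector"
)

scores = classifier([text])[0]  # list of {"label": ..., "score": ...} dicts
predicted = [s["label"] for s in scores if s["score"] >= 0.5]
print(predicted)  # e.g. sectors such as Transport and Energy
```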
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6.9e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 8
- weight_decay: 0.001
- gradient_accumulation_steps: 1

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision Micro | Precision Weighted | Precision Samples | Recall Micro | Recall Weighted | Recall Samples | F1-score |
|:-------------:|:-----:|:----:|:---------------:|:---------------:|:------------------:|:-----------------:|:------------:|:---------------:|:--------------:|:--------:|
| 0.4478 | 1.0 | 897 | 0.2277 | 0.6731 | 0.7183 | 0.7460 | 0.8822 | 0.8822 | 0.8989 | 0.7871 |
| 0.2241 | 2.0 | 1794 | 0.1862 | 0.7088 | 0.7485 | 0.7754 | 0.8933 | 0.8933 | 0.9110 | 0.8108 |
| 0.1647 | 3.0 | 2691 | 0.2025 | 0.6785 | 0.7023 | 0.7634 | 0.9124 | 0.9124 | 0.9252 | 0.8077 |
| 0.1232 | 4.0 | 3588 | 0.1839 | 0.7274 | 0.7322 | 0.7976 | 0.9029 | 0.9029 | 0.9134 | 0.8286 |
| 0.0899 | 5.0 | 4485 | 0.1889 | 0.7919 | 0.8007 | 0.8350 | 0.8909 | 0.8909 | 0.9060 | 0.8483 |
| 0.0653 | 6.0 | 5382 | 0.2039 | 0.7478 | 0.7544 | 0.8098 | 0.8973 | 0.8973 | 0.9114 | 0.8346 |
| 0.0462 | 7.0 | 6279 | 0.2149 | 0.7447 | 0.7500 | 0.8060 | 0.8989 | 0.8989 | 0.9107 | 0.8323 |
| 0.0336 | 8.0 | 7176 | 0.2181 | 0.7733 | 0.7780 | 0.8221 | 0.8909 | 0.8909 | 0.9031 | 0.8400 |

## Environmental Impact

*Carbon emissions were estimated using [codecarbon](https://github.com/mlco2/codecarbon).*

- **Hardware Type:** NVIDIA T4 (16 GB)
- **Hours used:** 3
- **Cloud Provider:** Google Colab
- **Carbon Emitted:** 0.276132 kg of CO₂eq

### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
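The micro-, weighted-, and samples-averaged precision and recall reported above correspond to scikit-learn's averaging modes over multi-label indicator matrices. The sketch below uses made-up toy arrays (not the actual evaluation data) to show how such numbers are obtained.

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

# Toy multi-label indicator matrices (n_samples x n_labels).
# In practice, y_pred comes from thresholding the model's sigmoid outputs.
y_true = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
y_pred = np.array([[1, 0, 1], [0, 1, 0], [1, 0, 0]])

for average in ("micro", "weighted", "samples"):
    p = precision_score(y_true, y_pred, average=average, zero_division=0)
    r = recall_score(y_true, y_pred, average=average, zero_division=0)
    print(f"precision_{average} = {p:.4f}, recall_{average} = {r:.4f}")
```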
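Likewise, the emissions figure above is the kind of estimate codecarbon produces when a run is wrapped in its tracker. A minimal illustrative sketch follows; the actual tracker configuration used for this model is not documented, and the `time.sleep` call merely stands in for the fine-tuning run.

```python
import time
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()
try:
    time.sleep(5)  # stand-in for the actual fine-tuning run
finally:
    emissions = tracker.stop()  # estimated emissions in kg of CO2-eq

print(f"Estimated emissions: {emissions:.6f} kg of CO2eq")
```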