Update README.md
3. Code generation and completion tasks in Java.
4. FIM (code infilling) tasks specific to Java.
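A FIM request is just a specially formatted prompt. As a sketch only: the sentinel tokens below (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) are the StarCoder-family convention and are an assumption here — verify them against this model's tokenizer before relying on them.

```python
# Sketch: assemble a fill-in-the-middle (FIM) prompt.
# ASSUMPTION: StarCoder-style sentinel tokens; check the model's
# tokenizer special tokens before use.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap the code before and after the hole in FIM sentinels;
    the model generates the missing middle after <fim_middle>."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Example: ask the model to fill in the body of a Java expression.
prefix = "public int add(int a, int b) {\n    return "
suffix = ";\n}"
prompt = build_fim_prompt(prefix, suffix)
```

The resulting string is passed to the tokenizer exactly like an ordinary completion prompt.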

# How to Use

## Sample inference code
```
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
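Note that `model.generate` does not stop when the started Java construct is complete; the decoded string usually runs past the method or class being written. One way to post-process — an illustrative helper, not part of the model or the `transformers` API — is to cut the completion where the braces first balance:

```python
# Sketch: truncate generated Java once every opening brace has been
# closed. Illustrative only; production code should prefer stop
# sequences or a real Java parser.

def trim_to_balanced_braces(code: str) -> str:
    depth = 0
    for i, ch in enumerate(code):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return code[: i + 1]  # keep the closing brace
    return code  # braces never balance: return unchanged

generated = (
    "public class HelloWorld {\n"
    "    public static void main(String[] args) {\n"
    '        System.out.println("Hello");\n'
    "    }\n"
    "}\n"
    "// stray text the model kept generating"
)
print(trim_to_balanced_braces(generated))  # stray trailing text is dropped
```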

# Attribution & Other Requirements

The pretraining dataset for the model was curated to include only data with permissive licenses. Despite this, the model is capable of generating source code verbatim from the dataset. The licenses of such code may necessitate attribution and adherence to other specific conditions. To facilitate compliance, we provide a [search index](https://huggingface.co/spaces/bigcode/search) that enables users to trace the origins of generated code within the pretraining data, allowing for proper attribution and adherence to licensing requirements.

# Limitations

The NT-Java-1.1B model has been trained on publicly available datasets and is offered without any safety guarantees. As with all language models, its outputs are inherently unpredictable, and the generated code may not perform as expected. Additionally, the code may be inefficient or contain bugs and security vulnerabilities. Consequently, it is imperative for users and developers to undertake extensive safety testing and to implement robust filtering mechanisms tailored to their specific needs.
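The recommendation to filter model output can be made concrete with a minimal screen over generated code. The patterns below are illustrative examples only, not a vetted or exhaustive security denylist:

```python
import re

# Sketch: flag generated Java that touches obviously sensitive APIs.
# ASSUMPTION: the pattern list is a toy example; real filtering needs
# static analysis and human review tailored to your deployment.
RISKY_PATTERNS = [
    r"Runtime\.getRuntime\(\)\.exec",  # arbitrary process execution
    r"ProcessBuilder",                 # process execution
    r"System\.exit",                   # terminates the host JVM
    r"java\.lang\.reflect",            # reflection-based access
]

def flag_risky_calls(code: str) -> list[str]:
    """Return the patterns that match anywhere in the generated code."""
    return [p for p in RISKY_PATTERNS if re.search(p, code)]

sample = 'Runtime.getRuntime().exec("rm -rf /");'
print(flag_risky_calls(sample))  # the exec pattern matches
```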

# Training

## Model

# License

The model is licensed under the BigCode OpenRAIL-M v1 license agreement. You can find the full agreement [here](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).

# Citation

```
@article{li2023starcoder,