Shashwat13333 committed
Commit dac0a9c · verified · 1 Parent(s): f53b76e

Model save
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "word_embedding_dimension": 768,
+ "pooling_mode_cls_token": true,
+ "pooling_mode_mean_tokens": false,
+ "pooling_mode_max_tokens": false,
+ "pooling_mode_mean_sqrt_len_tokens": false,
+ "pooling_mode_weightedmean_tokens": false,
+ "pooling_mode_lasttoken": false,
+ "include_prompt": true
+ }
README.md ADDED
@@ -0,0 +1,895 @@
1
+ ---
2
+ language:
3
+ - en
4
+ license: apache-2.0
5
+ tags:
6
+ - sentence-transformers
7
+ - sentence-similarity
8
+ - feature-extraction
9
+ - generated_from_trainer
10
+ - dataset_size:150
11
+ - loss:MatryoshkaLoss
12
+ - loss:MultipleNegativesRankingLoss
13
+ base_model: BAAI/bge-base-en-v1.5
14
+ widget:
15
+ - source_sentence: Do you provide support 24/7?
16
+ sentences:
17
+ - 'How can we get started with your DevOps solutions?
18
+
19
+ Getting started is easy. Contact us through our website. We''ll schedule a consultation
20
+ to discuss your needs, evaluate your current infrastructure, and propose a customized
21
+ DevOps solution designed to achieve your goals.'
22
+ - 'This is our Portfolio
23
+
24
+ Introducing the world of Housing Finance& Banking Firm.
25
+
26
+ Corporate Website with 10 regional languages in India with analytics and user
27
+ personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage
28
+ the Builder Requests, approve/deny Properties, manage visits and appointments,
29
+ manage leads, etc.
30
+
31
+
32
+
33
+ Introducing the world of Global Automotive Brand.We have implemented a Multi Locale
34
+ Multilingual Omnichannel platform for Royal Enfield. The platform supports public
35
+ websites, customer portals, internal portals, business applications for over 35+
36
+ different locations all over the world.
37
+
38
+
39
+ Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML
40
+ in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops &
41
+ Data Governance
42
+
43
+ Managing cloud provisioning and modernization alongside automated infrastructure,
44
+ event-driven microservices, containerization, DevOps, cybersecurity, and 24x7
45
+ monitoring support ensures efficient, secure, and responsive IT operations.'
46
+ - "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\
47
+ \ in comprehensive website audits that provide valuable insights and recommendations\
48
+ \ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\
49
+ \ roadmap that transform your digital enterprise and produce a return on investment,\
50
+ \ basis our discovery framework, brainstorming sessions & current state analysis.\n\
51
+ \nPlatform Selection\nHelping you select the optimal digital experience, commerce,\
52
+ \ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\
53
+ \ next-gen scalable and agile enterprise digital platforms, along with multi-platform\
54
+ \ integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer\
55
+ \ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\
56
+ \ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\
57
+ \ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\
58
+ \ applications, data, and IT workloads, along with Application maintenance and\
59
+ \ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\
60
+ \ team to solve your hiring challenges with our easy to deploy staff augmentation\
61
+ \ offerings.\""
62
+ - source_sentence: What services do you offer for AI adoption?
63
+ sentences:
64
+ - 'In what ways can machine learning optimize our operations?
65
+
66
+ Machine learning algorithms can analyze operational data to identify inefficiencies,
67
+ predict maintenance needs, optimize supply chains, and automate repetitive tasks,
68
+ significantly improving operational efficiency and reducing costs.'
69
+ - " We specialize in guiding companies through the complexities of adopting and\
70
+ \ integrating Artificial Intelligence and Machine Learning technologies. Our consultancy\
71
+ \ services are designed to enhance your operational efficiency and decision-making\
72
+ \ capabilities across all sectors. With a global network of AI/ML experts and\
73
+ \ a commitment to excellence, we are your partners in transforming innovative\
74
+ \ possibilities into real-world achievements. \
75
+ \ \
76
+ \ \n DATA INTELLIGENCE PLATFORMS we specialize\
77
+ \ in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\""
78
+ - 'We are a New breed of innovative digital transformation agency, redefining storytelling
79
+ for an always-on world.
80
+
81
+ With roots dating back to 2017, we started as a pocket size team of enthusiasts
82
+ with a goal of helping traditional businesses transform and create dynamic, digital
83
+ cultures through disruptive strategies and agile deployment of innovative solutions.'
84
+ - source_sentence: What kind of data do you leverage for AI solutions?
85
+ sentences:
86
+ - 'How do we do Custom Development ?
87
+
88
+ We follow below process to develop custom web or mobile Application on Agile Methodology,
89
+ breaking requirements in pieces and developing and shipping them with considering
90
+ utmost quality:
91
+
92
+ Requirements Analysis
93
+
94
+ We begin by understanding the clients needs and objectives for the website. Identify
95
+ key features, functionality, and any specific design preferences.
96
+
97
+
98
+ Project Planning
99
+
100
+ Then create a detailed project plan outlining the scope, timeline, and milestones.
101
+ Define the technology stack and development tools suitable for the project.
102
+
103
+
104
+ User Experience Design
105
+
106
+ Then comes the stage of Developing wireframes or prototypes to visualize the website's
107
+ structure and layout. We create a custom design that aligns with the brand identity
108
+ and user experience goals.
109
+
110
+
111
+ Development
112
+
113
+ After getting Sign-off on Design from Client, we break the requirements into Sprints
114
+ on Agile Methodology, and start developing them.'
115
+ - Our AI/ML services pave the way for transformative change across industries, embodying
116
+ a client-focused approach that integrates seamlessly with human-centric innovation.
117
+ Our collaborative teams are dedicated to fostering growth, leveraging data, and
118
+ harnessing the predictive power of artificial intelligence to forge the next wave
119
+ of software excellence. We don't just deliver AI; we deliver the future.
120
+ - 'Why do we need Microservices ?
121
+
122
+ Instead of building a monolithic application where all functionalities are tightly
123
+ integrated, microservices break down the system into modular and loosely coupled
124
+ services.
125
+
126
+
127
+ Scalability
128
+
129
+ Flexibility and Agility
130
+
131
+ Resilience and Fault Isolation
132
+
133
+ Technology Diversity
134
+
135
+ Continuous Delivery'
136
+ - source_sentence: What challenges did the company face in its early days?
137
+ sentences:
138
+ - 'Our Solutions
139
+
140
+ Strategy & Digital Transformation
141
+
142
+ Innovate via digital transformation, modernize tech, craft product strategies,
143
+ enhance customer experiences, optimize data analytics, transition to cloud for
144
+ growth and efficiency
145
+
146
+
147
+ Product Engineering & Custom Development
148
+
149
+ Providing product development, enterprise web and mobile development, microservices
150
+ integrations, quality engineering, and application support services to drive innovation
151
+ and enhance operational efficiency.'
152
+ - 'What makes your DevOps solutions stand out from the competition?
153
+
154
+ Our DevOps solutions stand out due to our personalized approach, extensive expertise,
155
+ and commitment to innovation. We focus on delivering measurable results, such
156
+ as reduced deployment times, improved system reliability, and enhanced security,
157
+ ensuring you get the maximum benefit from our services.'
158
+ - 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal
159
+ decision to depart from the corporate ladder in December 2016. Fueled by a clear
160
+ vision to revolutionize the digital landscape, Mayank set out to leverage the
161
+ best technology ingredients, crafting custom applications and digital ecosystems
162
+ tailored to clients'' specific needs, limitations, and budgets.
163
+
164
+
165
+ However, this solo journey was not without its challenges. Mayank had to initiate
166
+ the revenue engine by offering corporate trainings and conducting online batches
167
+ for tech training across the USA. He also undertook small projects and subcontracted
168
+ modules of larger projects for clients in the US, UK, and India. It was only after
169
+ this initial groundwork that Mayank was able to hire a group of interns, whom
170
+ he meticulously trained and groomed to prepare them for handling Enterprise Level
171
+ Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial
172
+ spirit in building TechChefz Digital from the ground up.
173
+
174
+
175
+ With a passion for innovation and a relentless drive for excellence, Mayank has
176
+ steered TechChefz Digital through strategic partnerships, groundbreaking projects,
177
+ and exponential growth. His leadership has been instrumental in shaping the company
178
+ into a leading force in the digital transformation arena, inspiring a culture
179
+ of innovation and excellence that continues to propel the company forward.'
180
+ - source_sentence: What do you guys do for digital strategy?
181
+ sentences:
182
+ - " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\
183
+ \ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\
184
+ Helping you select the optimal digital experience, commerce, cloud and marketing\
185
+ \ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\
186
+ \ and agile enterprise digital platforms, along with multi-platform integrations.\n\
187
+ \nProduct Builds\nHelp you ideate, strategize, and engineer your product with\
188
+ \ help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and\
189
+ \ augment your existing team to solve your hiring challenges with our easy to\
190
+ \ deploy staff augmentation offerings .\nManaged Services\nOperate and monitor\
191
+ \ your business-critical applications, data, and IT workloads, along with Application\
192
+ \ maintenance and operations\n"
193
+ - 'Introducing the world of General Insurance Firm
194
+
195
+ In this project, we implemented Digital Solution and Implementation with Headless
196
+ Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the
197
+ following features:
198
+
199
+ PWA & AMP based Web Pages
200
+
201
+ Page Speed Optimization
202
+
203
+ Reusable and scalable React JS / Next JS Templates and Components
204
+
205
+ Headless Drupal CMS with Content & Experience management, approval workflows,
206
+ etc for seamless collaboration between the business and marketing teams
207
+
208
+ Minimalistic Buy and Renewal Journeys for various products, with API integrations
209
+ and adherence to data compliances
210
+
211
+
212
+ We achieved 250% Reduction in Operational Time and Effort in managing the Content
213
+ & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during
214
+ buy and renewal journeys, 300% Reduction in bounce rate on policy landing and
215
+ campaign pages'
216
+ - 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions
217
+ for Complex Problems and delieverd a comprehensive Website Development, Production
218
+ Support & Managed Services, we optimized customer journeys, integrate analytics,
219
+ CRM, ERP, and third-party applications, and implement cutting-edge technologies
220
+ for enhanced performance and efficiency
221
+
222
+ and achievied 200% Reduction in operational time & effort managing content & experience,
223
+ 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion
224
+ & Retention'
225
+ pipeline_tag: sentence-similarity
226
+ library_name: sentence-transformers
227
+ metrics:
228
+ - cosine_accuracy@1
229
+ - cosine_accuracy@3
230
+ - cosine_accuracy@5
231
+ - cosine_accuracy@10
232
+ - cosine_precision@1
233
+ - cosine_precision@3
234
+ - cosine_precision@5
235
+ - cosine_precision@10
236
+ - cosine_recall@1
237
+ - cosine_recall@3
238
+ - cosine_recall@5
239
+ - cosine_recall@10
240
+ - cosine_ndcg@10
241
+ - cosine_mrr@10
242
+ - cosine_map@100
243
+ model-index:
244
+ - name: BGE base Financial Matryoshka
245
+ results:
246
+ - task:
247
+ type: information-retrieval
248
+ name: Information Retrieval
249
+ dataset:
250
+ name: dim 768
251
+ type: dim_768
252
+ metrics:
253
+ - type: cosine_accuracy@1
254
+ value: 0.24
255
+ name: Cosine Accuracy@1
256
+ - type: cosine_accuracy@3
257
+ value: 0.6
258
+ name: Cosine Accuracy@3
259
+ - type: cosine_accuracy@5
260
+ value: 0.6933333333333334
261
+ name: Cosine Accuracy@5
262
+ - type: cosine_accuracy@10
263
+ value: 0.8266666666666667
264
+ name: Cosine Accuracy@10
265
+ - type: cosine_precision@1
266
+ value: 0.24
267
+ name: Cosine Precision@1
268
+ - type: cosine_precision@3
269
+ value: 0.19999999999999998
270
+ name: Cosine Precision@3
271
+ - type: cosine_precision@5
272
+ value: 0.13866666666666666
273
+ name: Cosine Precision@5
274
+ - type: cosine_precision@10
275
+ value: 0.08266666666666665
276
+ name: Cosine Precision@10
277
+ - type: cosine_recall@1
278
+ value: 0.24
279
+ name: Cosine Recall@1
280
+ - type: cosine_recall@3
281
+ value: 0.6
282
+ name: Cosine Recall@3
283
+ - type: cosine_recall@5
284
+ value: 0.6933333333333334
285
+ name: Cosine Recall@5
286
+ - type: cosine_recall@10
287
+ value: 0.8266666666666667
288
+ name: Cosine Recall@10
289
+ - type: cosine_ndcg@10
290
+ value: 0.5168483575362663
291
+ name: Cosine Ndcg@10
292
+ - type: cosine_mrr@10
293
+ value: 0.41823809523809513
294
+ name: Cosine Mrr@10
295
+ - type: cosine_map@100
296
+ value: 0.42416446853284084
297
+ name: Cosine Map@100
298
+ - task:
299
+ type: information-retrieval
300
+ name: Information Retrieval
301
+ dataset:
302
+ name: dim 512
303
+ type: dim_512
304
+ metrics:
305
+ - type: cosine_accuracy@1
306
+ value: 0.14666666666666667
307
+ name: Cosine Accuracy@1
308
+ - type: cosine_accuracy@3
309
+ value: 0.6
310
+ name: Cosine Accuracy@3
311
+ - type: cosine_accuracy@5
312
+ value: 0.6666666666666666
313
+ name: Cosine Accuracy@5
314
+ - type: cosine_accuracy@10
315
+ value: 0.8133333333333334
316
+ name: Cosine Accuracy@10
317
+ - type: cosine_precision@1
318
+ value: 0.14666666666666667
319
+ name: Cosine Precision@1
320
+ - type: cosine_precision@3
321
+ value: 0.2
322
+ name: Cosine Precision@3
323
+ - type: cosine_precision@5
324
+ value: 0.1333333333333333
325
+ name: Cosine Precision@5
326
+ - type: cosine_precision@10
327
+ value: 0.0813333333333333
328
+ name: Cosine Precision@10
329
+ - type: cosine_recall@1
330
+ value: 0.14666666666666667
331
+ name: Cosine Recall@1
332
+ - type: cosine_recall@3
333
+ value: 0.6
334
+ name: Cosine Recall@3
335
+ - type: cosine_recall@5
336
+ value: 0.6666666666666666
337
+ name: Cosine Recall@5
338
+ - type: cosine_recall@10
339
+ value: 0.8133333333333334
340
+ name: Cosine Recall@10
341
+ - type: cosine_ndcg@10
342
+ value: 0.47000090388642707
343
+ name: Cosine Ndcg@10
344
+ - type: cosine_mrr@10
345
+ value: 0.36049735449735437
346
+ name: Cosine Mrr@10
347
+ - type: cosine_map@100
348
+ value: 0.3666672731277011
349
+ name: Cosine Map@100
350
+ - task:
351
+ type: information-retrieval
352
+ name: Information Retrieval
353
+ dataset:
354
+ name: dim 256
355
+ type: dim_256
356
+ metrics:
357
+ - type: cosine_accuracy@1
358
+ value: 0.21333333333333335
359
+ name: Cosine Accuracy@1
360
+ - type: cosine_accuracy@3
361
+ value: 0.56
362
+ name: Cosine Accuracy@3
363
+ - type: cosine_accuracy@5
364
+ value: 0.6533333333333333
365
+ name: Cosine Accuracy@5
366
+ - type: cosine_accuracy@10
367
+ value: 0.76
368
+ name: Cosine Accuracy@10
369
+ - type: cosine_precision@1
370
+ value: 0.21333333333333335
371
+ name: Cosine Precision@1
372
+ - type: cosine_precision@3
373
+ value: 0.18666666666666668
374
+ name: Cosine Precision@3
375
+ - type: cosine_precision@5
376
+ value: 0.13066666666666663
377
+ name: Cosine Precision@5
378
+ - type: cosine_precision@10
379
+ value: 0.07599999999999998
380
+ name: Cosine Precision@10
381
+ - type: cosine_recall@1
382
+ value: 0.21333333333333335
383
+ name: Cosine Recall@1
384
+ - type: cosine_recall@3
385
+ value: 0.56
386
+ name: Cosine Recall@3
387
+ - type: cosine_recall@5
388
+ value: 0.6533333333333333
389
+ name: Cosine Recall@5
390
+ - type: cosine_recall@10
391
+ value: 0.76
392
+ name: Cosine Recall@10
393
+ - type: cosine_ndcg@10
394
+ value: 0.4826639910228885
395
+ name: Cosine Ndcg@10
396
+ - type: cosine_mrr@10
397
+ value: 0.3937354497354497
398
+ name: Cosine Mrr@10
399
+ - type: cosine_map@100
400
+ value: 0.40352633551410066
401
+ name: Cosine Map@100
402
+ - task:
403
+ type: information-retrieval
404
+ name: Information Retrieval
405
+ dataset:
406
+ name: dim 128
407
+ type: dim_128
408
+ metrics:
409
+ - type: cosine_accuracy@1
410
+ value: 0.16
411
+ name: Cosine Accuracy@1
412
+ - type: cosine_accuracy@3
413
+ value: 0.5066666666666667
414
+ name: Cosine Accuracy@3
415
+ - type: cosine_accuracy@5
416
+ value: 0.6533333333333333
417
+ name: Cosine Accuracy@5
418
+ - type: cosine_accuracy@10
419
+ value: 0.7333333333333333
420
+ name: Cosine Accuracy@10
421
+ - type: cosine_precision@1
422
+ value: 0.16
423
+ name: Cosine Precision@1
424
+ - type: cosine_precision@3
425
+ value: 0.1688888888888889
426
+ name: Cosine Precision@3
427
+ - type: cosine_precision@5
428
+ value: 0.13066666666666665
429
+ name: Cosine Precision@5
430
+ - type: cosine_precision@10
431
+ value: 0.0733333333333333
432
+ name: Cosine Precision@10
433
+ - type: cosine_recall@1
434
+ value: 0.16
435
+ name: Cosine Recall@1
436
+ - type: cosine_recall@3
437
+ value: 0.5066666666666667
438
+ name: Cosine Recall@3
439
+ - type: cosine_recall@5
440
+ value: 0.6533333333333333
441
+ name: Cosine Recall@5
442
+ - type: cosine_recall@10
443
+ value: 0.7333333333333333
444
+ name: Cosine Recall@10
445
+ - type: cosine_ndcg@10
446
+ value: 0.42870835906079113
447
+ name: Cosine Ndcg@10
448
+ - type: cosine_mrr@10
449
+ value: 0.33153439153439146
450
+ name: Cosine Mrr@10
451
+ - type: cosine_map@100
452
+ value: 0.3424976196127222
453
+ name: Cosine Map@100
454
+ - task:
455
+ type: information-retrieval
456
+ name: Information Retrieval
457
+ dataset:
458
+ name: dim 64
459
+ type: dim_64
460
+ metrics:
461
+ - type: cosine_accuracy@1
462
+ value: 0.13333333333333333
463
+ name: Cosine Accuracy@1
464
+ - type: cosine_accuracy@3
465
+ value: 0.38666666666666666
466
+ name: Cosine Accuracy@3
467
+ - type: cosine_accuracy@5
468
+ value: 0.4666666666666667
469
+ name: Cosine Accuracy@5
470
+ - type: cosine_accuracy@10
471
+ value: 0.6933333333333334
472
+ name: Cosine Accuracy@10
473
+ - type: cosine_precision@1
474
+ value: 0.13333333333333333
475
+ name: Cosine Precision@1
476
+ - type: cosine_precision@3
477
+ value: 0.1288888888888889
478
+ name: Cosine Precision@3
479
+ - type: cosine_precision@5
480
+ value: 0.09333333333333334
481
+ name: Cosine Precision@5
482
+ - type: cosine_precision@10
483
+ value: 0.06933333333333333
484
+ name: Cosine Precision@10
485
+ - type: cosine_recall@1
486
+ value: 0.13333333333333333
487
+ name: Cosine Recall@1
488
+ - type: cosine_recall@3
489
+ value: 0.38666666666666666
490
+ name: Cosine Recall@3
491
+ - type: cosine_recall@5
492
+ value: 0.4666666666666667
493
+ name: Cosine Recall@5
494
+ - type: cosine_recall@10
495
+ value: 0.6933333333333334
496
+ name: Cosine Recall@10
497
+ - type: cosine_ndcg@10
498
+ value: 0.38361272138781966
499
+ name: Cosine Ndcg@10
500
+ - type: cosine_mrr@10
501
+ value: 0.28834391534391535
502
+ name: Cosine Mrr@10
503
+ - type: cosine_map@100
504
+ value: 0.30056764135792025
505
+ name: Cosine Map@100
506
+ ---
507
+
508
+ # BGE base Financial Matryoshka
509
+
510
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
511
+
512
+ ## Model Details
513
+
514
+ ### Model Description
515
+ - **Model Type:** Sentence Transformer
516
+ - **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
517
+ - **Maximum Sequence Length:** 512 tokens
518
+ - **Output Dimensionality:** 768 dimensions
519
+ - **Similarity Function:** Cosine Similarity
520
+ <!-- - **Training Dataset:** Unknown -->
521
+ - **Language:** en
522
+ - **License:** apache-2.0
523
+
524
+ ### Model Sources
525
+
526
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
527
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
528
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
529
+
530
+ ### Full Model Architecture
531
+
532
+ ```
533
+ SentenceTransformer(
534
+ (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
535
+ (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
536
+ (2): Normalize()
537
+ )
538
+ ```
539
+
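+ The three modules above amount to: run the underlying BertModel, take the CLS token embedding (per the Pooling configuration in `1_Pooling/config.json`), and L2-normalize it. For reference, a minimal sketch of the equivalent computation in plain 🤗 Transformers, assuming the repository also exposes the standard transformer and tokenizer files:
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+ from transformers import AutoModel, AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained("Shashwat13333/bge-base-en-v1.5_v3")
+ model = AutoModel.from_pretrained("Shashwat13333/bge-base-en-v1.5_v3")
+
+ inputs = tokenizer(
+     ["What do you guys do for digital strategy?"],
+     padding=True, truncation=True, max_length=512, return_tensors="pt",
+ )
+ with torch.no_grad():
+     outputs = model(**inputs)
+
+ # CLS-token pooling (pooling_mode_cls_token: true), then L2 normalization
+ embeddings = F.normalize(outputs.last_hidden_state[:, 0], p=2, dim=1)
+ print(embeddings.shape)  # torch.Size([1, 768])
+ ```
+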
540
+ ## Usage
541
+
542
+ ### Direct Usage (Sentence Transformers)
543
+
544
+ First install the Sentence Transformers library:
545
+
546
+ ```bash
547
+ pip install -U sentence-transformers
548
+ ```
549
+
550
+ Then you can load this model and run inference.
551
+ ```python
552
+ from sentence_transformers import SentenceTransformer
553
+
554
+ # Download from the 🤗 Hub
555
+ model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v3")
556
+ # Run inference
557
+ sentences = [
558
+ 'What do you guys do for digital strategy?',
559
+ ' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
560
+ 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency\nand achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention',
561
+ ]
562
+ embeddings = model.encode(sentences)
563
+ print(embeddings.shape)
564
+ # [3, 768]
565
+
566
+ # Get the similarity scores for the embeddings
567
+ similarities = model.similarity(embeddings, embeddings)
568
+ print(similarities.shape)
569
+ # [3, 3]
570
+ ```
571
+
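+ The same embeddings can be used for retrieval. A minimal semantic-search sketch; the corpus and query below are illustrative, not taken from the training data:
+
+ ```python
+ from sentence_transformers import SentenceTransformer, util
+
+ model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v3")
+
+ corpus = [
+     "We offer custom software development and digital marketing strategies.",
+     "Our DevOps solutions reduce deployment times and improve system reliability.",
+ ]
+ query = "Do you provide DevOps services?"
+
+ corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
+ query_embedding = model.encode(query, convert_to_tensor=True)
+
+ # Rank corpus entries by cosine similarity to the query
+ hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
+ for hit in hits:
+     print(round(hit["score"], 3), corpus[hit["corpus_id"]])
+ ```
+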
572
+ <!--
573
+ ### Direct Usage (Transformers)
574
+
575
+ <details><summary>Click to see the direct usage in Transformers</summary>
576
+
577
+ </details>
578
+ -->
579
+
580
+ <!--
581
+ ### Downstream Usage (Sentence Transformers)
582
+
583
+ You can finetune this model on your own dataset.
584
+
585
+ <details><summary>Click to expand</summary>
586
+
587
+ </details>
588
+ -->
589
+
590
+ <!--
591
+ ### Out-of-Scope Use
592
+
593
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
594
+ -->
595
+
596
+ ## Evaluation
597
+
598
+ ### Metrics
599
+
600
+ #### Information Retrieval
601
+
602
+ * Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
603
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
604
+
605
+ | Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
606
+ |:--------------------|:-----------|:---------|:-----------|:-----------|:-----------|
607
+ | cosine_accuracy@1 | 0.24 | 0.1467 | 0.2133 | 0.16 | 0.1333 |
608
+ | cosine_accuracy@3 | 0.6 | 0.6 | 0.56 | 0.5067 | 0.3867 |
609
+ | cosine_accuracy@5 | 0.6933 | 0.6667 | 0.6533 | 0.6533 | 0.4667 |
610
+ | cosine_accuracy@10 | 0.8267 | 0.8133 | 0.76 | 0.7333 | 0.6933 |
611
+ | cosine_precision@1 | 0.24 | 0.1467 | 0.2133 | 0.16 | 0.1333 |
612
+ | cosine_precision@3 | 0.2 | 0.2 | 0.1867 | 0.1689 | 0.1289 |
613
+ | cosine_precision@5 | 0.1387 | 0.1333 | 0.1307 | 0.1307 | 0.0933 |
614
+ | cosine_precision@10 | 0.0827 | 0.0813 | 0.076 | 0.0733 | 0.0693 |
615
+ | cosine_recall@1 | 0.24 | 0.1467 | 0.2133 | 0.16 | 0.1333 |
616
+ | cosine_recall@3 | 0.6 | 0.6 | 0.56 | 0.5067 | 0.3867 |
617
+ | cosine_recall@5 | 0.6933 | 0.6667 | 0.6533 | 0.6533 | 0.4667 |
618
+ | cosine_recall@10 | 0.8267 | 0.8133 | 0.76 | 0.7333 | 0.6933 |
619
+ | **cosine_ndcg@10** | **0.5168** | **0.47** | **0.4827** | **0.4287** | **0.3836** |
620
+ | cosine_mrr@10 | 0.4182 | 0.3605 | 0.3937 | 0.3315 | 0.2883 |
621
+ | cosine_map@100 | 0.4242 | 0.3667 | 0.4035 | 0.3425 | 0.3006 |
622
+
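+ Because the model was trained with a Matryoshka objective, its embeddings can be truncated to the smaller dimensions evaluated above (512, 256, 128, 64), with the drop in nDCG@10 shown in the table. A minimal sketch using the `truncate_dim` option of Sentence Transformers:
+
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Emit 256-dimensional embeddings instead of the full 768
+ model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v3", truncate_dim=256)
+
+ embeddings = model.encode(["What do you guys do for digital strategy?"])
+ print(embeddings.shape)  # (1, 256)
+ ```
+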
623
+ <!--
624
+ ## Bias, Risks and Limitations
625
+
626
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
627
+ -->
628
+
629
+ <!--
630
+ ### Recommendations
631
+
632
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
633
+ -->
634
+
635
+ ## Training Details
636
+
637
+ ### Training Dataset
638
+
639
+ #### Unnamed Dataset
640
+
641
+ * Size: 150 training samples
642
+ * Columns: <code>anchor</code> and <code>positive</code>
643
+ * Approximate statistics based on the first 150 samples:
644
+ | | anchor | positive |
645
+ |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
646
+ | type | string | string |
647
+ | details | <ul><li>min: 7 tokens</li><li>mean: 11.97 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 125.49 tokens</li><li>max: 378 tokens</li></ul> |
648
+ * Samples:
649
+ | anchor | positive |
650
+ |:--------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
651
+ | <code>Is it hard to move old systems to the cloud?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
652
+ | <code>What benefits does marketing automation offer for time management?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
653
+ | <code>do you track customer behavior?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
654
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
655
+ ```json
656
+ {
657
+ "loss": "MultipleNegativesRankingLoss",
658
+ "matryoshka_dims": [
659
+ 768,
660
+ 512,
661
+ 256,
662
+ 128,
663
+ 64
664
+ ],
665
+ "matryoshka_weights": [
666
+ 1,
667
+ 1,
668
+ 1,
669
+ 1,
670
+ 1
671
+ ],
672
+ "n_dims_per_step": -1
673
+ }
674
+ ```
675
+
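+ For reference, a minimal sketch of how a loss with these parameters could be constructed; it assumes the base model and the `anchor`/`positive` pair columns described above:
+
+ ```python
+ from sentence_transformers import SentenceTransformer
+ from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
+
+ model = SentenceTransformer("BAAI/bge-base-en-v1.5")
+
+ # Inner loss: in-batch negatives over (anchor, positive) pairs
+ inner_loss = MultipleNegativesRankingLoss(model)
+
+ # Outer loss: apply the inner loss at each truncated embedding size
+ loss = MatryoshkaLoss(
+     model,
+     inner_loss,
+     matryoshka_dims=[768, 512, 256, 128, 64],
+     matryoshka_weights=[1, 1, 1, 1, 1],
+ )
+ ```
+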
676
+ ### Training Hyperparameters
677
+ #### Non-Default Hyperparameters
678
+
679
+ - `eval_strategy`: epoch
680
+ - `gradient_accumulation_steps`: 4
681
+ - `learning_rate`: 1e-05
682
+ - `weight_decay`: 0.01
683
+ - `num_train_epochs`: 4
684
+ - `lr_scheduler_type`: cosine
685
+ - `warmup_ratio`: 0.1
686
+ - `fp16`: True
687
+ - `load_best_model_at_end`: True
688
+ - `optim`: adamw_torch_fused
689
+ - `push_to_hub`: True
690
+ - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v3
691
+ - `push_to_hub_model_id`: bge-base-en-v1.5_v3
692
+ - `batch_sampler`: no_duplicates
693
+
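+ A minimal sketch of how these non-default values map onto `SentenceTransformerTrainingArguments`; the `output_dir` and `save_strategy` below are assumptions, not values listed in this card:
+
+ ```python
+ from sentence_transformers.training_args import (
+     BatchSamplers,
+     SentenceTransformerTrainingArguments,
+ )
+
+ args = SentenceTransformerTrainingArguments(
+     output_dir="bge-base-en-v1.5_v3",   # assumed output directory
+     eval_strategy="epoch",
+     save_strategy="epoch",              # assumed; required with load_best_model_at_end
+     gradient_accumulation_steps=4,
+     learning_rate=1e-5,
+     weight_decay=0.01,
+     num_train_epochs=4,
+     lr_scheduler_type="cosine",
+     warmup_ratio=0.1,
+     fp16=True,
+     load_best_model_at_end=True,
+     optim="adamw_torch_fused",
+     push_to_hub=True,
+     hub_model_id="Shashwat13333/bge-base-en-v1.5_v3",
+     batch_sampler=BatchSamplers.NO_DUPLICATES,
+ )
+ ```
+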
694
+ #### All Hyperparameters
695
+ <details><summary>Click to expand</summary>
696
+
697
+ - `overwrite_output_dir`: False
698
+ - `do_predict`: False
699
+ - `eval_strategy`: epoch
700
+ - `prediction_loss_only`: True
701
+ - `per_device_train_batch_size`: 8
702
+ - `per_device_eval_batch_size`: 8
703
+ - `per_gpu_train_batch_size`: None
704
+ - `per_gpu_eval_batch_size`: None
705
+ - `gradient_accumulation_steps`: 4
706
+ - `eval_accumulation_steps`: None
707
+ - `torch_empty_cache_steps`: None
708
+ - `learning_rate`: 1e-05
709
+ - `weight_decay`: 0.01
710
+ - `adam_beta1`: 0.9
711
+ - `adam_beta2`: 0.999
712
+ - `adam_epsilon`: 1e-08
713
+ - `max_grad_norm`: 1.0
714
+ - `num_train_epochs`: 4
715
+ - `max_steps`: -1
716
+ - `lr_scheduler_type`: cosine
717
+ - `lr_scheduler_kwargs`: {}
718
+ - `warmup_ratio`: 0.1
719
+ - `warmup_steps`: 0
720
+ - `log_level`: passive
721
+ - `log_level_replica`: warning
722
+ - `log_on_each_node`: True
723
+ - `logging_nan_inf_filter`: True
724
+ - `save_safetensors`: True
725
+ - `save_on_each_node`: False
726
+ - `save_only_model`: False
727
+ - `restore_callback_states_from_checkpoint`: False
728
+ - `no_cuda`: False
729
+ - `use_cpu`: False
730
+ - `use_mps_device`: False
731
+ - `seed`: 42
732
+ - `data_seed`: None
733
+ - `jit_mode_eval`: False
734
+ - `use_ipex`: False
735
+ - `bf16`: False
736
+ - `fp16`: True
737
+ - `fp16_opt_level`: O1
738
+ - `half_precision_backend`: auto
739
+ - `bf16_full_eval`: False
740
+ - `fp16_full_eval`: False
741
+ - `tf32`: None
742
+ - `local_rank`: 0
743
+ - `ddp_backend`: None
744
+ - `tpu_num_cores`: None
745
+ - `tpu_metrics_debug`: False
746
+ - `debug`: []
747
+ - `dataloader_drop_last`: False
748
+ - `dataloader_num_workers`: 0
749
+ - `dataloader_prefetch_factor`: None
750
+ - `past_index`: -1
751
+ - `disable_tqdm`: False
752
+ - `remove_unused_columns`: True
753
+ - `label_names`: None
754
+ - `load_best_model_at_end`: True
755
+ - `ignore_data_skip`: False
756
+ - `fsdp`: []
757
+ - `fsdp_min_num_params`: 0
758
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
759
+ - `fsdp_transformer_layer_cls_to_wrap`: None
760
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
761
+ - `deepspeed`: None
762
+ - `label_smoothing_factor`: 0.0
763
+ - `optim`: adamw_torch_fused
764
+ - `optim_args`: None
765
+ - `adafactor`: False
766
+ - `group_by_length`: False
767
+ - `length_column_name`: length
768
+ - `ddp_find_unused_parameters`: None
769
+ - `ddp_bucket_cap_mb`: None
770
+ - `ddp_broadcast_buffers`: False
771
+ - `dataloader_pin_memory`: True
772
+ - `dataloader_persistent_workers`: False
773
+ - `skip_memory_metrics`: True
774
+ - `use_legacy_prediction_loop`: False
775
+ - `push_to_hub`: True
776
+ - `resume_from_checkpoint`: None
777
+ - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v3
778
+ - `hub_strategy`: every_save
779
+ - `hub_private_repo`: None
780
+ - `hub_always_push`: False
781
+ - `gradient_checkpointing`: False
782
+ - `gradient_checkpointing_kwargs`: None
783
+ - `include_inputs_for_metrics`: False
784
+ - `include_for_metrics`: []
785
+ - `eval_do_concat_batches`: True
786
+ - `fp16_backend`: auto
787
+ - `push_to_hub_model_id`: bge-base-en-v1.5_v3
788
+ - `push_to_hub_organization`: None
789
+ - `mp_parameters`:
790
+ - `auto_find_batch_size`: False
791
+ - `full_determinism`: False
792
+ - `torchdynamo`: None
793
+ - `ray_scope`: last
794
+ - `ddp_timeout`: 1800
795
+ - `torch_compile`: False
796
+ - `torch_compile_backend`: None
797
+ - `torch_compile_mode`: None
798
+ - `dispatch_batches`: None
799
+ - `split_batches`: None
800
+ - `include_tokens_per_second`: False
801
+ - `include_num_input_tokens_seen`: False
802
+ - `neftune_noise_alpha`: None
803
+ - `optim_target_modules`: None
804
+ - `batch_eval_metrics`: False
805
+ - `eval_on_start`: False
806
+ - `use_liger_kernel`: False
807
+ - `eval_use_gather_object`: False
808
+ - `average_tokens_across_devices`: False
809
+ - `prompts`: None
810
+ - `batch_sampler`: no_duplicates
811
+ - `multi_dataset_batch_sampler`: proportional
812
+
813
+ </details>
814
+
815
+ ### Training Logs
816
+ | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
817
+ |:----------:|:-----:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
818
+ | 0.2105 | 1 | 24.6456 | - | - | - | - | - |
819
+ | 0.8421 | 4 | - | 0.4674 | 0.4603 | 0.4340 | 0.4308 | 0.3485 |
820
+ | 1.2105 | 5 | 19.7625 | - | - | - | - | - |
821
+ | **1.8421** | **8** | **-** | **0.4873** | **0.477** | **0.4744** | **0.4591** | **0.3772** |
822
+ | 2.4211 | 10 | 16.24 | - | - | - | - | - |
823
+ | 2.8421 | 12 | - | 0.5446 | 0.4638 | 0.4531 | 0.3992 | 0.3599 |
824
+ | 3.6316 | 15 | 14.3556 | - | - | - | - | - |
825
+ | 3.8421 | 16 | - | 0.5168 | 0.4700 | 0.4827 | 0.4287 | 0.3836 |
826
+
827
+ * The bold row denotes the saved checkpoint.
828
+
829
+ ### Framework Versions
830
+ - Python: 3.11.11
831
+ - Sentence Transformers: 3.4.1
832
+ - Transformers: 4.48.2
833
+ - PyTorch: 2.5.1+cu124
834
+ - Accelerate: 1.3.0
835
+ - Datasets: 3.2.0
836
+ - Tokenizers: 0.21.0
837
+
838
+ ## Citation
839
+
840
+ ### BibTeX
841
+
842
+ #### Sentence Transformers
843
+ ```bibtex
844
+ @inproceedings{reimers-2019-sentence-bert,
845
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
846
+ author = "Reimers, Nils and Gurevych, Iryna",
847
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
848
+ month = "11",
849
+ year = "2019",
850
+ publisher = "Association for Computational Linguistics",
851
+ url = "https://arxiv.org/abs/1908.10084",
852
+ }
853
+ ```
854
+
855
+ #### MatryoshkaLoss
856
+ ```bibtex
857
+ @misc{kusupati2024matryoshka,
858
+ title={Matryoshka Representation Learning},
859
+ author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
860
+ year={2024},
861
+ eprint={2205.13147},
862
+ archivePrefix={arXiv},
863
+ primaryClass={cs.LG}
864
+ }
865
+ ```
866
+
867
+ #### MultipleNegativesRankingLoss
868
+ ```bibtex
869
+ @misc{henderson2017efficient,
870
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
871
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
872
+ year={2017},
873
+ eprint={1705.00652},
874
+ archivePrefix={arXiv},
875
+ primaryClass={cs.CL}
876
+ }
877
+ ```
878
+
879
+ <!--
880
+ ## Glossary
881
+
882
+ *Clearly define terms in order to be accessible across audiences.*
883
+ -->
884
+
885
+ <!--
886
+ ## Model Card Authors
887
+
888
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
889
+ -->
890
+
891
+ <!--
892
+ ## Model Card Contact
893
+
894
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
895
+ -->
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "__version__": {
+ "sentence_transformers": "3.4.1",
+ "transformers": "4.48.2",
+ "pytorch": "2.5.1+cu124"
+ },
+ "prompts": {},
+ "default_prompt_name": null,
+ "similarity_fn_name": "cosine"
+ }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9aed894bd4ba5213c40f8bc0597a69443461d75fbf75f990eaa479c0926f33fd
+ oid sha256:13cd8aae0278c0e8fb06f7a37c34c5db85466322d76769761041405b764f190e
  size 437951328
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ },
+ {
+ "idx": 2,
+ "name": "2",
+ "path": "2_Normalize",
+ "type": "sentence_transformers.models.Normalize"
+ }
+ ]
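
These three entries are the modules SentenceTransformer instantiates, in order, when the checkpoint is loaded: the Transformer backbone, the CLS Pooling module from `1_Pooling`, and a final Normalize step. A minimal sketch of building the equivalent pipeline explicitly from the base model (rather than loading this fine-tuned checkpoint):

```python
from sentence_transformers import SentenceTransformer, models

# Same three-module pipeline as described by modules.json
word_embedding_model = models.Transformer("BAAI/bge-base-en-v1.5", max_seq_length=512)
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),  # 768
    pooling_mode="cls",
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model, models.Normalize()])
```
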
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 512,
+ "do_lower_case": true
+ }