Deep Dive Into Datasets For Token Classification in Argilla: Start Annotating For NER NLP Tasks

  • Published Oct 1, 2024

Comments • 3

  • @HarshGajera-f4f  2 months ago

    Hi Kamalraj,
    I need your help with a project I'm working on. The project involves storing dataset records that come from different sources like Hugging Face, imports, etc. I have stored these dataset records on GCP, and now I need to send these records to Argilla.
    I noticed that to push records to Argilla, we need to convert the data into Argilla's data structure. The problem is that I don't have a specific record format because Hugging Face uses different record formats for different models. Could you please help me with this?
    Thank you!

    • @insightbuilder  2 months ago

      @HarshGajera-f4f
      I remember Argilla has some integration with HF datasets. Could you share your git repo? I can review it when time permits.
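For readers following this thread: the conversion the question describes is essentially building Argilla record objects from whatever rows your source (Hugging Face, GCP exports, etc.) gives you, then logging them to the server. Below is a minimal sketch under stated assumptions: it uses the Argilla 1.x SDK (`rg.init`, `rg.TokenClassificationRecord`, `rg.log`) and the CoNLL-2003 dataset as a stand-in source; the server URL, API key, and dataset name are placeholders, and newer Argilla 2.x releases use a different client-based API, so adapt accordingly.

```python
# Sketch: push a Hugging Face token-classification dataset into Argilla as NER records.
# Assumptions: Argilla 1.x SDK; "conll2003" as an example source; URL/key/name are placeholders.
import argilla as rg
from datasets import load_dataset

rg.init(api_url="http://localhost:6900", api_key="argilla.apikey")  # placeholder server/key

ds = load_dataset("conll2003", split="train[:100]")
label_names = ds.features["ner_tags"].feature.names  # e.g. ["O", "B-PER", "I-PER", ...]

records = []
for row in ds:
    tokens = row["tokens"]
    tags = [label_names[t] for t in row["ner_tags"]]

    # Rebuild the raw text and per-token character offsets (tokens joined by spaces).
    text, offsets, pos = "", [], 0
    for tok in tokens:
        offsets.append((pos, pos + len(tok)))
        text += tok + " "
        pos += len(tok) + 1
    text = text.rstrip()

    # Merge BIO tags into (label, start_char, end_char) entity spans.
    spans, current = [], None
    for (start, end), tag in zip(offsets, tags):
        if tag.startswith("B-"):
            if current:
                spans.append(current)
            current = [tag[2:], start, end]
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[2] = end  # extend the open span
        else:
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)

    records.append(
        rg.TokenClassificationRecord(
            text=text,
            tokens=tokens,
            annotation=[tuple(s) for s in spans],
        )
    )

rg.log(records=records, name="conll2003-ner-demo")  # placeholder dataset name
```

The only piece that changes per source is the loop body: map whatever fields your stored records have onto `text`, `tokens`, and the entity spans, and the rest of the logging flow stays the same.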