skit_pipelines.components.sample_conversations_generator package

Submodules

skit_pipelines.components.sample_conversations_generator.utils module

run_conda_python_command(command)[source]

Module contents

sample_conversations_generator(output_path: OutputPath, filename: str, prompt_file_path: str, n_iter: int, n_choice: int, temperature: float, model: str, llm_trainer_repo_name: str, llm_trainer_repo_branch: str, situation_file_path: str = '', situations: typing.Optional[str] = None)[source]
Parameters
  • situations (str, optional) – List of situations to generate conversations for, passed as a string

  • output_path (str) – The output path where the generated conversations get stored

  • filename (str) – Acts as a prefix to the default name of the output file

  • prompt_file_path (str) – Path to the file containing the prompt used for data generation

  • n_iter (int) – Number of times to iterate over the situations list while generating conversations

  • n_choice (int) – Number of conversations generated at a time from a single situation.

  • temperature (float) – Sampling temperature used for conversation generation

  • model (str) – Model to be used for generating data
  • llm_trainer_repo_name (str) – The conversation generation repo name on GitHub.

  • llm_trainer_repo_branch (str, optional) – The branch name in the conversation generation repo to use, defaults to main.

Returns
  Path of the txt file where the generated conversations are stored
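
A rough way to estimate how many conversations a single run produces, assuming each of the n_iter passes generates n_choice conversations for every situation in the list (illustrative only; the exact count depends on the prompt and the conversation generation repo):

    # Illustrative estimate, not part of the component.
    situations = ["cancel an order", "track a refund", "change delivery address"]
    n_iter, n_choice = 2, 3
    expected_conversations = n_iter * len(situations) * n_choice  # 2 * 3 * 3 = 18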

sample_conversations_generator_op(filename: str, prompt_file_path: str, n_iter: int, n_choice: int, temperature: float, model: str, llm_trainer_repo_name: str, llm_trainer_repo_branch: str, situation_file_path: str = '', situations: str = None)

Sample conversations generator
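
A minimal sketch of how the op might be wired into a Kubeflow pipeline. The pipeline name, prompt path, model name, repo name, branch and situation text below are illustrative placeholders, not values defined by this package:

    from kfp import dsl

    from skit_pipelines.components.sample_conversations_generator import (
        sample_conversations_generator_op,
    )

    @dsl.pipeline(name="sample-conversations-demo")
    def demo_pipeline():
        # All argument values here are placeholders for illustration.
        gen_task = sample_conversations_generator_op(
            filename="demo",
            prompt_file_path="prompts/generation_prompt.txt",
            n_iter=2,
            n_choice=3,
            temperature=0.9,
            model="gpt-3.5-turbo",
            llm_trainer_repo_name="llm-trainer",
            llm_trainer_repo_branch="main",
            situations="user wants to cancel an order",
        )
        # gen_task.output should point to the txt file produced by the component
        # and can be passed on to downstream components in the same pipeline.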