
Output truncation #602

@darrinh

Description


How do I extend the output of llama-2-13b-chat.ggmlv3.q2_K.bin when using llama-cpp-python? For example, if I prompt with:

How to make an egg omelette?

The output gets cut off:

```
Step 1: Gather your ingredients and tools needed for making an egg omelette. You will need eggs, salt, pepper, butter or oil, and a non-stick pan or skillet. Make sure you have all these ingredients before moving on to step 2.

Step 2: Crack the eggs in a bowl and beat them with a fork until they are well mixed. Add salt and pepper to taste and mix well.

Step 3: Heat the pan or skillet over medium-low heat. When the butter or oil is melted, pour the egg mixture into the pan.

Step 4: Let the eggs cook for about 2-3 minutes until the edges start to set and the center still looks runny. Use a spatula to gently push the cooked eggs towards the center of the pan while tilting the pan so the uncooked egg can flow to the edges.

Step 5: After another minute or so, the eggs should now be almost fully set, but still moist and slightly jiggly in the center. Use a spatula to carefully fold one half of the
```

How do I get the rest of the output?
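
For reference, I'm calling the model roughly like this (a minimal sketch; the path and generation parameters are placeholders, and I may be missing something obvious):

```python
from llama_cpp import Llama

# Load the GGML model; n_ctx is the context window size (placeholder value).
llm = Llama(
    model_path="./llama-2-13b-chat.ggmlv3.q2_K.bin",
    n_ctx=2048,
)

# Generate a completion. I suspect the cutoff is related to a token limit
# such as max_tokens, but I'm not sure what to set it to.
output = llm(
    "How to make an egg omelette?",
    max_tokens=512,  # placeholder; is this what controls the output length?
)

print(output["choices"][0]["text"])
```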

thanks
