AI analyzed my coli posts (thanks to @bnew for the tutorial)

Fill Collins

I didn't mean to verge!
Joined
Jun 19, 2019
Messages
11,243
Reputation
3,224
Daps
35,038
So I got this, what about y'all? :ehh: :manny:

I looked up certain keywords like "drugs", "women", "politics", etc., and chose the first five





used the claude haiku model


you have to provide context and specify key terms and names in your prompts to reduce hallucinations.
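As a rough illustration of that advice, here's a minimal sketch of building a prompt that front-loads context and pins down key terms and names. The function name and prompt wording are my own assumptions for illustration, not from any particular tutorial or API:

```python
# Sketch: assemble an analysis prompt that supplies context and key terms
# up front to reduce hallucinations. Structure and wording are illustrative
# assumptions, not a fixed format.

def build_prompt(posts, key_terms, username):
    """Build a prompt that names the user and constrains the topics."""
    context = (
        f"You are analyzing forum posts written by the user '{username}'. "
        f"Only discuss these key terms: {', '.join(key_terms)}. "
        "If a post does not mention them, say so instead of guessing."
    )
    body = "\n---\n".join(posts)
    return f"{context}\n\nPosts:\n{body}"

prompt = build_prompt(
    posts=["thoughts on politics...", "another post..."],
    key_terms=["drugs", "women", "politics"],
    username="Fill Collins",
)
print(prompt)
```

The resulting string would then be sent as the user message to whatever model you're using (e.g. Claude Haiku).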

edit:
partial screenshot of text
 

Fill Collins
all of that to have AI tell me what I already know about myself? :childplease:
Yes :ehh:

@Fill Collins do my shyt I’m technologically illiterate
Give me five+ keywords you'd want me to use. I say this because if you just used my feelings on non-Blacks, you'd get this analysis :skip: :

In summary, "Fill Collins" appears to hold a highly radicalized, far-right, and anti-minority political ideology, with a strong Black nationalist bent and some anti-capitalist economic views. Their posts demonstrate a concerning level of hatred, bigotry, and extremism towards various religious and ethnic groups, particularly Muslims, Jews, and Indians

"Femdom"

Even the AI knows you a fxggot sissy that likes to get pegged @Fill Collins
Pegged? :mindblown:

The name's FILL not FILLED :dame:

Keep your fantasies to yourself :scust:
 

Apprentice

RIP Doughboy Roc
Joined
Oct 10, 2017
Messages
19,319
Reputation
4,806
Daps
88,160
Reppin
DMV
Try these key words

Common coli topics

Marriage
LeBron James
Student debt
Drake
Loyalty
 

bnew

Veteran
Joined
Nov 1, 2015
Messages
57,335
Reputation
8,496
Daps
160,001
keep in mind it has its limits.


In the context of large language models, tokens refer to the basic units of text that the model processes and generates.

Specifically:

- Tokens are the individual words, subword pieces, numbers, punctuation marks, or other symbols that make up the input text or output text of the language model.

- The language model is trained on a large corpus of text data, which is broken down into these discrete tokens during the training process.

- The model learns patterns and relationships between these tokens, allowing it to understand the meaning and structure of language.

- When you provide input text to the language model, it processes that text by breaking it down into individual tokens that it can analyze and understand.

- Similarly, when the model generates new text, it does so by selecting and combining tokens in a way that produces coherent and meaningful output.

So in simple terms, tokens are the building blocks that the language model uses to comprehend and generate human language. They are the fundamental units of text that allow the model to perform its language-related tasks.
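To make that concrete, here's a toy tokenizer in Python. Real models like Claude use learned subword vocabularies (BPE-style), so this simple word/punctuation split is only an illustration of the idea, not how Claude actually tokenizes:

```python
import re

def toy_tokenize(text):
    """Split text into word and punctuation tokens.
    Real LLM tokenizers use learned subword vocabularies,
    so these counts are only a rough illustration."""
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("Tokens are the building blocks of language models.")
print(tokens)
# ['Tokens', 'are', 'the', 'building', 'blocks', 'of', 'language', 'models', '.']
print(len(tokens))  # 9
```

Note that the period counts as its own token, which is why token counts usually run higher than word counts.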


In the context of large language models like myself, the "context window" refers to the maximum number of tokens (words, characters, etc.) that the model can consider at one time when generating new text.

When you see that Claude-3-haiku has a context window of 200K, it means that the model can take into account up to 200,000 tokens of previous text when producing new output.

This is important because it determines how much contextual information the model can use to understand the meaning and flow of the text it's generating. A larger context window allows the model to better grasp the broader context and produce more coherent, relevant, and natural-sounding language.

In the case of Claude-3-haiku, the 200K context window gives the model a fairly large amount of preceding text to draw upon when generating a response ("Haiku" here is just the name of the model tier, not a reference to haiku poetry). This helps its output flow logically and thematically over long passages.

So in summary, the 200K context window means this particular model version of Claude has the capacity to consider a substantial amount of prior text when producing new output, which is beneficial for tasks like creative writing and maintaining coherence over longer passages. The larger the context window, the more contextual information the model can leverage.
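One practical consequence: if the posts you feed in exceed the context window, you have to trim them. Here's a rough sketch that keeps the most recent posts within a token budget, using the common ~4 characters-per-token heuristic (that ratio is an assumption, not an exact tokenizer count):

```python
def truncate_to_budget(posts, max_tokens=200_000, chars_per_token=4):
    """Keep the most recent posts that fit a rough token budget.
    chars_per_token=4 is a common rule-of-thumb estimate, not an
    exact count from any real tokenizer."""
    budget_chars = max_tokens * chars_per_token
    kept, used = [], 0
    for post in reversed(posts):  # walk newest-first
        if used + len(post) > budget_chars:
            break
        kept.append(post)
        used += len(post)
    return list(reversed(kept))  # restore chronological order

# Tiny budget so the oldest post gets dropped:
recent = truncate_to_budget(
    ["old post " * 10, "newer post", "newest post"],
    max_tokens=10,
)
print(recent)  # ['newer post', 'newest post']
```

Dropping the oldest posts first is just one strategy; summarizing older posts before sending them is another way to stay under the window.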
 