My own meta-try from the textsynth.org site, using its own description as input:
Text completion using the GPT-2 language model. It is a neural network of 1.5 billion parameters. Type a text and let the neural network complete it. Each try returns a different randomly chosen completion.

The system is still far from being perfect. I need to add a bit of support for parsing and formatting of HTML, CSS, and JSON.
If anyone has ideas on what could go in the UI, I am open to suggestions.
Usage:
Run a text with the command:
$ python train-pct2.py [text] [options] [data]
and wait a while for results to appear. You can use the --progress ( -- ) flag to tell the training of the system to continue, or use the --stop ( -- ) flag to stop training. If you run with no options, the system will continue on its current task.
Options:

  --input text    is the text to be trained
  --output text   is the path to the text as output
  --progress      will print a summary of the progress of the system every 15 seconds
  --stop          will stop training the system and print a summary of
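Each try returns a different completion because the model samples from its output distribution instead of always taking the most likely next token. A minimal sketch of top-k sampling over a toy distribution, assuming NumPy; the five-token vocabulary and logit scores here are made up for illustration, and textsynth.org's actual sampling settings are not documented in this post:

```python
import numpy as np

def top_k_sample(logits, k, rng):
    """Sample one token id from the k highest-scoring logits."""
    logits = np.asarray(logits, dtype=float)
    top = np.argsort(logits)[-k:]               # indices of the k best tokens
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                        # softmax over the top k only
    return int(rng.choice(top, p=probs))

# Toy example: 5-token vocabulary with made-up scores.
logits = [2.0, 0.5, 1.5, -1.0, 0.1]
rng = np.random.default_rng(0)
picks = [top_k_sample(logits, k=3, rng=rng) for _ in range(100)]
# Only the three highest-scoring tokens (ids 0, 2, 1) can ever be chosen,
# but which one comes out varies from draw to draw -- hence a different
# completion on each try.
assert set(picks) <= {0, 1, 2}
```

Restricting the draw to the top k tokens keeps the output varied without letting very unlikely tokens derail the text; at each step of a real completion the sampled token is appended to the prompt and the model is run again.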