Self-generated training examples are actually possible right now.

First, install the dependencies on a Linux machine (usually Ubuntu; I use Mint):

sudo apt-get install ruby-full; sudo gem install gabbler; sudo gem install espeak-ruby

Now open a text editor, set the file type to Ruby, and create a simple gabbler file.

def gabble_with_examples
  require "gabbler"
  require "espeak"

  gabbler = Gabbler.new
  data    = File.read("data/examples.txt")
  gabbler.learn(data)             # feed the generated examples to gabbler
  phrase  = gabbler.sentence      # sample a sentence from what it learned
  speech  = ESpeak::Speech.new(phrase)
  speech.speak                    # read the sentence aloud
end

Note that I place gabbler inside a method; the reason is that you will call another method first to generate the training examples. That method should be placed above your gabble_with_examples method.

Now to create the grammar-generator method:

def grammar_generator
  def greeting
    hi    =    "Hi "
    hey   =   "Hey "
    hello = "Hello "
    heyo  =  "Heyo "
    ola   =   "Ola "
    greeting_index = [
      hi, hey, heyo, hello, ola
    ]
    $do_greeting = greeting_index.sample
  end

  def agent
    your_name = File.read("identity/config_name.txt").strip
    $do_agent = your_name
  end

  def prompt
    good_morning   = ", good morning!\n"
    how_are_you    = ", how are you?\n"
    you_doing_well = ", you doing well?\n"
    prompt_index = [
      good_morning, how_are_you, you_doing_well
    ]
    $do_prompt = prompt_index.sample
  end

  open("data/examples.txt", "w") { |f|
    9.times do
      greeting               # resample $do_greeting
      agent                  # reload $do_agent
      prompt                 # resample $do_prompt
      f.print $do_greeting
      f.print $do_agent
      f.print $do_prompt
    end
  }
end

Now simply call your methods; remember to call the grammar generator first:

g1 = grammar_generator; g2 = gabble_with_examples

So an algorithm that can generate its own training examples is not actually that far off in the future; it mostly comes down to how complex you want to make your code.

I usually combine multiple machine learning algorithms: for example, decision trees usually work best as the input-learning algorithm, while gabbler works better as a data-sampling algorithm.
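As a toy illustration of that division of labor (the branch conditions and phrase pools below are invented for the example, not taken from a trained model): a hand-rolled decision rule plays the part of the input learner, deciding which pool the sampler draws from.

```ruby
# A hand-written stand-in for a learned decision tree: it routes an
# input feature (hour of day) to a phrase pool, and a sampler then
# draws from that pool -- the classifier decides, the sampler generates.
def phrase_pool(hour)
  if hour < 12
    ["Good morning!", "Top of the morning!"]
  elsif hour < 18
    ["Good afternoon!"]
  else
    ["Good evening!", "Evening!"]
  end
end

phrase = phrase_pool(9).sample
```

In a real pipeline the branching logic would be learned from data; the point is only that classification and generation are separate jobs.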