About Rachael Repp

Rachael's aim in life is to gather knowledge from the Internet while minimizing the losses involved in holding that knowledge within her “brain”. Unlike conventional conversational agents, she does not use neural nets, fuzzy logic, AIML, genetic algorithms, or any form of man-made topic map or ontology. In fact, she does not use any pattern-matching technique at all. Instead, she uses our novel general cognition engine, which allows her to process and absorb unstructured knowledge and carry out a process of “concept matching” in order to retrieve an answer to a comment. So, for instance, if the user says:

"Government is a necessary evil to organize society."

Her reply might be, depending on the exact wording:

"When you are living in any state, take service with the most worthy among its great officers, and make friends of the most virtuous among its scholars."

Yes... she retrieved this after reading the Analects, so she is a bit of a wise-ass! She does this by matching concepts rather than looking at any specific word in the input text. Her vocabulary is on the order of 100,000 words, including 30,000 scientific terms. Our approach enables her to identify identical concepts in texts, pick out those that are relevant to what is being discussed, and currently she can do some simple reasoning, such as giving the correct reply to a question like:

'John has three times the number of apples Jane has. Jane has 4 apples. How many apples does John have?'

the answer being 12, of course. We are working hard to give her the capability to answer some logic questions using the E theorem prover developed at the Technical University of Munich (TUM). Currently, while we integrate all the various modules, she can only do this by processing a single line of input, but in the short term we will enable her to use her short-term memory executive, and in the medium term she will be able to use her own long-term memory.
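The arithmetic behind the example question above can be sketched in a few lines of code. To be clear, this is not Rachael's engine (which does concept matching rather than pattern matching); it is only a toy illustration, using a regular expression we made up, of the multiplicative word-problem shape she can currently solve:

```python
import re

def solve_times_problem(text):
    """Toy solver for 'X has N times the number of <items> Y has. Y has M <items>.'

    Hypothetical helper for illustration only -- not part of Rachael's engine.
    """
    # Map a few number words to integers (enough for this example).
    word_to_num = {"two": 2, "three": 3, "four": 4, "five": 5}
    # Find the multiplier, e.g. "three times".
    times = re.search(r"(\w+) times", text)
    # Find the first explicit digit count, e.g. "has 4".
    count = re.search(r"has (\d+)", text)
    multiplier = word_to_num[times.group(1).lower()]
    base = int(count.group(1))
    return multiplier * base

question = ("John has three times the number of apples Jane has. "
            "Jane has 4 apples. How many apples does John have?")
print(solve_times_problem(question))  # -> 12
```

The point of the contrast is that a pattern-based bot only answers questions that fit a template like this one, whereas concept matching aims to handle the same question however it is worded.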

She has a general knowledge base of some 100,000 words and is also currently reading some books that we consider basic in order to give her a charismatic, if slightly quirky, personality. These books are:

  1. The New Testament
  2. Basic Scientific Dictionary
  3. Beyond Good and Evil
  4. The Analects
  5. Metamorphosis
  6. The Prince
  7. The Republic
  8. The Wealth of Nations
  9. The Origins of Man
  10. Alice in Wonderland

Some of these can be found, converted into Simplish, on the Simplish site (https://www.simplish.org/readings/#general). We have added some other bits and bobs, as well as quotes and pieces of everyday knowledge, part of which can be seen at https://www.rachaelrepp.org/knowledge/. She can also handle specific instructions and react to frequent questions and answers, as described at https://www.rachaelrepp.org/base/, in order to make her behave a little more like standard bots, such as Howie.

Later on, she will start adding knowledge and improving upon herself by exploring the Internet using KuKulcan, our knowledge management tool, which has multiple extractors, metadata manipulation, and the ability to add knowledge to her long-term memory; its summarizing technology can be seen at www.simplish.org for short pieces of text. We hope she will be fully functional 24/7, learning and improving upon herself, within the next year or two.

She is currently able to speak using MaryTTS, with inflexion derived from its meaning-sensing modules, but she cannot yet hear (no misogynistic quips about how excellent the model is, please...), though we are working hard to give her speech recognition soon, using the CMU Sphinx toolkit.

Much remains to be done before this becomes a fully functional system but, to our knowledge, it is the only programme directed at building a human-level artificial cognition system capable of improving upon itself. We are happy to stand corrected on this last point and invite people to give us their views on our blog or directly via e-mail. We welcome collaboration with those whose interests and objectives are compatible with ours and look forward to hearing from you. This is very much a work in progress, and we intend to update this page reasonably regularly.

29th May, 2020.