MarkovChains


A term from the study of Probability.

(Named after Andrei Markov) A model of sequences of events in which the probability of each event occurring depends on the event that preceded it.

This enables both the learning of patterns and the "replaying" of existing and novel patterns. Note that the pattern probably has to be linear, but that's about it.

A predominant use of Markov Chains is learning and replaying English grammar, so I'll take my example from that domain:

Imagine two sentences, fed to a Markov chainer:

Roses are red.

Violets are blue
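
As a minimal sketch of what a chainer might store internally (Python, with names that are purely illustrative rather than taken from any real chainer), feeding in those two sentences amounts to recording, for each word, every word observed immediately after it:

from collections import defaultdict

def build_chain(sentences):
    """Record, for each word, every word seen immediately after it."""
    chain = defaultdict(list)
    for sentence in sentences:
        words = sentence.split()
        for current, following in zip(words, words[1:]):
            chain[current].append(following)
    return chain

chain = build_chain(["Roses are red.", "Violets are blue"])
# chain["Roses"] == ["are"]
# chain["are"]   == ["red.", "blue"]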

If you ask the Markov chainer to play back a sentence beginning with "Roses", the first word will obviously be "Roses". In the two sentences above, there is only one word that ever is used after "Roses", which is "are". However, in the example sentences, there are *two* possible ways to follow on from "are" - "red", and "blue" - therefore the Markov chainer will pick one at random.
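
Playing back is then just a random walk over that table. Continuing the sketch above, a hypothetical generate function picks uniformly among the recorded successors, so after "are" it chooses "red." or "blue" at random:

import random

def generate(chain, start, max_words=10):
    """Walk the chain from `start`, picking a random recorded successor."""
    words = [start]
    while words[-1] in chain and len(words) < max_words:
        words.append(random.choice(chain[words[-1]]))
    return " ".join(words)

print(generate(chain, "Roses"))  # "Roses are red." or "Roses are blue"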

If we add a third sentence, say:

Roses feed aphids

and generate a sentence beginning "Roses", then we have two choices for the second word. If the chainer chooses "are", we proceed as above. If the chainer chooses "feed", the only possible next word is "aphids."
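
Continuing the sketch, the third sentence gives "Roses" two recorded successors, so the random choice now happens at the very first step:

chain = build_chain(["Roses are red.", "Violets are blue", "Roses feed aphids"])
# chain["Roses"] == ["are", "feed"]   (two choices for the second word)
# chain["feed"]  == ["aphids"]        (the only possible next word)
print(generate(chain, "Roses"))  # e.g. "Roses are blue" or "Roses feed aphids"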

Additional and more complex statements can be generated with a larger initial corpus. As an exercise for the reader, consider what would happen if a Markov chainer were fed a long series of definitions of the form "x is y", and Gertrude Stein's famous edict that "A rose is a rose is a rose". More human-realistic statements can also be generated by considering more than one preceding word when choosing the next.
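
For instance (still the same illustrative sketch), a chainer keyed on the previous two words remembers that "are" was only ever followed by "red." when "Roses" came before it, so the novel "Roses are blue" can no longer be generated:

def build_chain2(sentences):
    """Order-2 chain: key on the previous two words instead of one."""
    chain = defaultdict(list)
    for sentence in sentences:
        words = sentence.split()
        for first, second, third in zip(words, words[1:], words[2:]):
            chain[(first, second)].append(third)
    return chain

chain2 = build_chain2(["Roses are red.", "Violets are blue", "Roses feed aphids"])
# chain2[("Roses", "are")] == ["red."]
# chain2[("Violets", "are")] == ["blue"]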

One implementation of these ideas might be the IRCBot MB-01.
