Default Rule & Pre-selector

Default Rule

The default rule is a concept that comes from the default production in Silver. It is not an actual rule (production) in the grammar, but rather a definition of elements (attribute implementations, etc.) that are shared by all rules expanding the same nonterminal type. The syntax for these (including the pre-selector) can be found at [Here].

    • The default rule is written before the expansion symbol := (but after the nonterminal name), unlike the actual rules. The Dungeon demo contains an example usage, and a schematic sketch is given at the end of this list.

    • The default rule cannot have any children, because it is not an actual rule.

    • In the current implementation, the default rule cannot have associated custom configure parameters; in the future it may allow some, for representing nonterminal-level custom configure parameters (those shared by all rules).

    • The default rule can have attribute implementations, both assignments for variable attributes and implementations of functional attributes. The semantics is equivalent to putting a copy of each in the implementation of every actual rule from the same nonterminal type (this means that any rule-specific information is invalid in the default rule, because it would not make sense in rules other than the specific one).

    • The default rule can also define generator, constructor, and destructor parts that are shared between rules from the same nonterminal type. The keywords for generators and constructors are 'pregenerator', 'preconstructor' and 'pregencontor'. The keyword for destructors is 'postdestructor'.

      • The statements in the "pregenerator" blocks are inserted in front of the explicit implementation of node generators in the scope of all actual rules from the nonterminal type, after the implicit part that reads rule level custom configure parameters (see the figure at [Here]).

      • The statements in the "preconstructor" blocks are inserted in front of the explicit implementation of node constructors in the scope of all actual rules from the nonterminal type.

      • The "pregencontor" blocks defines shared statements in the pre-generator and the pre-constructor.

      • The statements in the "postdestructor" blocks are inserted after the explicit implementation of node destructors in the scope of all actual rules from the nonterminal type.

    • An important type of element that appears in the default rule block is the pre-selector, which is started by the preselector keyword and defines statements that need to be executed before the stochastic selection process, thus providing a way to directly constrain that process.

    • The interactions between multiple occurrences of those blocks (pre-generator, pre-constructor, pre-gencontor, post-destructor, pre-selector) follow conventions analogous to those in actual rule scopes, explained at [Here].
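
As a schematic illustration, a default rule and its position relative to the actual rules might look as follows. This is a hypothetical sketch, not code from any demo: the rule names are borrowed from the Dungeon demo discussed below, and the block delimiters and rule-list notation here are assumptions for illustration; the exact syntax is the one specified at [Here].

    Area                              // nonterminal name
    {                                 // default rule: written before the expansion symbol :=
        preselector    { /* runs before the stochastic rule selection */ }
        pregenerator   { /* shared prefix of all node generators of Area */ }
        preconstructor { /* shared prefix of all node constructors of Area */ }
        postdestructor { /* shared suffix of all node destructors of Area */ }
    }
    := hDivide ( ... )                // actual rules come after :=
     | vDivide ( ... )
     | tArea ( ... )
     | bossArea ( ... )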

Pre-selector

The pre-selector is a block of statements that are executed before the (implicit) stochastic selection process but after the (implicit) rule probability evaluation process (see the figure at [Here]) in the node generator. Because of this special position in the node generator, it is often used to manipulate the rule probabilities according to local conditions, before they are used in the rule selection part. It therefore provides another, more direct way to control stochastic content generation, in addition to the lambda configuration feature.

    • Nonterminal arguments are available in the pre-selector, because it is part of the node generator. For example, in the Dungeon demo the pre-selector forbids certain rules according to the size of the current sub-area ("range") and whether the current sub-area should contain a boss room ("have_boss").

    • Node attributes are not available in the pre-selector, because it executes before the rule selection part, which means that the node has not been created yet.

    • In addition to regular C++ statements, some special GIGL syntax is allowed in the pre-selector block.

    • probof(hDivide) is an expression referring to the probability of the rule named "hDivide". Here we refer to the rule by its name. This provides a way to explicitly control rule probabilities in detail, e.g. probof(hDivide) += 0.01 will add 0.01 to the rule probability of the rule "hDivide". This syntax is also used in referencing rule probabilities in generator configurations (see [Here]).

      • probof[i] is an expression referring to the probability of the i'th rule from the nonterminal type, indexed from zero. This is just a variant of the syntax above where we refer to the rule by index rather than by name. Here "i" can be replaced with any integer-typed expression (including an integer constant). For example, probof[0] in the pre-selector in the Dungeon demo refers to the rule probability for "hDivide". This syntax is also used in referencing rule probabilities in generator configurations (see [Here]).
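
      For instance, a pre-selector might adjust rule probabilities as in the following sketch. It is hypothetical: the condition and constants are made up, "range" is the Dungeon demo's nonterminal argument mentioned above, and probof[0] refers to "hDivide" as noted above.

          preselector
          {
              if (range < 4)                 // hypothetical condition on the sub-area size
                  probof(hDivide) += 0.01;   // refer to the rule probability by name
              probof[0] *= 0.5;              // the same probability referred to by index
          }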

    • forbid tArea, bossArea;

      • is a statement forbidding the rules named "tArea" and "bossArea". This is an example of a type of statement we call forbid statements. The rules to be forbidden are listed at the end, and forbidding is implemented as setting their rule probabilities to zero. The rules can also be referred to with indices, as in

      • forbid [2], [3];

      • with the convention that indices are wrapped in brackets.
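
      A small sketch combining both forms (the conditions are hypothetical; "range" and "have_boss" are the Dungeon demo's nonterminal arguments, and treating indices 0 and 1 as "hDivide" and "vDivide" is an assumption for illustration):

          preselector
          {
              if (!have_boss) forbid bossArea;    // forbid by name
              if (range <= 1) forbid [0], [1];    // forbid by index (assumed hDivide, vDivide)
          }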

    • forbid <transferto bossArea> tArea;

      • is an example of using an option (wrapped in < and >) in forbid statements. It forbids the rule "tArea" by setting its probability to zero, but adds its original probability (in general, the sum of the original probabilities of the forbidden rules) to the rule "bossArea" (the rule specified after transferto). Again, any of the rules here can alternatively be referred to by indices (in brackets). Note that multiple rules can be forbidden, but only one rule can receive the transferred probability.
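
      In terms of the probof syntax above, the option presumably amounts to the following sketch (reading the probability before zeroing it):

          preselector
          {
              // same effect as:  forbid <transferto bossArea> tArea;
              probof(bossArea) += probof(tArea);   // transfer the probability being removed
              probof(tArea) = 0.0;                 // then forbid by setting it to zero
          }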

    • forbid <normalize> tArea;

      • uses another type of option in the forbid statement (normalization). It normalizes all probabilities as soon as the forbidden ones are set to zero (with negative ones corrected as well). Note that the default item type option will implicitly add a normalization at the end, but this provides a way to force normalization immediately. In addition, we can scale the normalization to a different total (by default it is 1.0) by using statements like

      • forbid <normalizeto 100.0> tArea;

      • where this 100.0 is just an example of probability scaling; in general it can be any expression compatible with the double-precision floating-point type. Again, any of the rules here can alternatively be referred to by indices (in brackets).
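
      Spelled out with the standalone statements described further below, the normalizing forbid is presumably equivalent to this sketch:

          preselector
          {
              // same effect as:  forbid <normalizeto 100.0> tArea;
              forbid tArea;
              normalize probs to 100.0;   // rescale the remaining probabilities to total 100.0
          }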

    • forbidexcept hDivide, vDivide;

      • is a variant of the forbid statement that forbids all rules except the ones specified. The "forbidexcept" statements can use the same set of options (transferring, normalization) as regular forbid statements, and they apply the same way, except that here the forbidden rules are the ones not listed. Again, any of the rules here can alternatively be referred to by indices (in brackets), with the slight caution that a forbidexcept statement with the exempted rules referenced by indices often performs worse than one where they are referenced by names.

    • force tArea;

      • forces selecting the specified rule (only a single rule). The forcing works even when the rule probability was originally set to zero. The rule here can also alternatively be referred to by an index (in brackets). Currently there is a discrepancy: "force" statements using a rule name will return from the pre-selector (therefore ignoring the statements after them in the pre-selector), while "force" statements using an index (as well as all versions of "forbid" and "forbidexcept" statements) will continue to execute the pre-selector (thus risking negating the forcing effect if one is not careful). The safest practice is not to place essential statements after this type of statement.
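
      A hypothetical sketch of this pitfall (the condition is made up; "range" and "have_boss" are the Dungeon demo's nonterminal arguments):

          preselector
          {
              if (have_boss && range == 1)
                  force bossArea;    // by name: returns from the pre-selector immediately
              // statements here are skipped when the force above fires, but would
              // still run if the rule were instead forced by index
          }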

    • nonegative probs;

      • performs the correction of negative probabilities (to zero) that would similarly be carried out at the end (before the default normalization, if any) with the default item type option.

    • normalize probs;

      • normalizes the probabilities to a total of 1.0, which is also one of the default actions at the end with the default item type option. A variation of this can scale the normalization to a different total, as in

      • normalize probs to 100.0;

      • where this 100.0 is just an example of probability scaling; in general it can be any expression compatible with the double-precision floating-point type.
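
      Combining the statements above, a pre-selector can apply an adjustment and then clean up the probabilities explicitly (the constant 0.2 is made up for illustration):

          preselector
          {
              probof(hDivide) -= 0.2;   // may push this probability below zero
              nonegative probs;         // clamp any negative probabilities to zero now
              normalize probs;          // renormalize to a total of 1.0 immediately
          }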

    • The control or constraint provided by pre-selector blocks can achieve effects very similar to those of the lambda configuration. For example, the depth limit in the Tree demo could be implemented with a pre-selector, by forbidding all rules except "termTree" once the "depth" reaches "n" (see the sketch at the end of this section); and all the mechanisms in the pre-selector in the Dungeon example can be encoded (with some effort) into lambda expressions that call other functions encoding those mechanisms. In general, however, the two features are designed for different purposes.

      • The main point is the distinction between type-wise and instance-wise mechanisms, which is also mentioned at [Here]. Pre-selectors encode type-wise mechanisms, i.e. the controls and constraints there apply to all instances of item generation, while configure parameters encode instance-wise mechanisms, i.e. they apply only to the item generations set by that instance of generator configuration. Therefore the main factor in deciding whether to use a pre-selector or configure parameters is whether the mechanism is intended to be type-wise or instance-wise. The depth limit in the Tree demo can actually be considered type-wise, as every configuration we show has the same interpretation of the depth limit in terms of not exceeding a hard threshold (it is encoded with configure parameters only because we want to introduce features gradually); however, if there were a need for another configuration with a soft limit (i.e. a termination probability that gradually increases with depth), then the current implementation using configure parameters would make more sense. For the Dungeon demo, using the pre-selector as the current implementation does is a good choice if the unit area size control and the boss room mechanism are intended to apply to the item type, but it could be different if they were designed otherwise.

      • A secondary factor in deciding between pre-selectors and configure parameters is that pre-selectors are more direct in applying constraints. Since configure parameters have to be set with expressions, much of the logic has to be encoded with 0s and 1s, C++ conditional expressions, or separately defined functions that encode the logic, and this needs to be considered for every affected rule probability. With pre-selectors this can be much more direct, as pre-selectors are themselves blocks of statements, which can encode the logic directly. For example, encoding the constraint mechanism of the Dungeon demo with configure parameters is possible, but we might have to separately define four additional functions, one for each rule probability, and then call them in the lambda expressions that set the probabilities in the generator configuration.
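
      As an illustration of the type-wise alternative mentioned above, the hard depth limit of the Tree demo could be written as a pre-selector along the following lines. The sketch is hypothetical: it assumes "depth" is available as a nonterminal argument and "n" as an accessible limit value, with "termTree" being the terminating rule.

          preselector
          {
              if (depth >= n)
                  forbidexcept termTree;   // at the limit, only the terminating rule remains
          }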