Merge pull request #15 from salt-formulas/develop
Merge Develop to Master
diff --git a/MANIFEST.in b/MANIFEST.in
index 1c1accc..1f4c27b 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -4,9 +4,11 @@
exclude Makefile requirements.txt .pylintrc reclass.py
# Exclude testing infra
exclude run_tests.py
+prune reclass/tests
prune reclass/datatypes/tests
prune reclass/storage/tests
prune reclass/utils/tests
+prune reclass/values/tests
# Exclude "source only" content
prune doc
prune examples
diff --git a/README-extentions.rst b/README-extentions.rst
new file mode 100644
index 0000000..97d78af
--- /dev/null
+++ b/README-extentions.rst
@@ -0,0 +1,381 @@
+Escaping of References and Inventory Queries
+--------------------------------------------
+
+References and inventory queries can be escaped to produce literal strings, for example:
+
+.. code-block:: yaml
+
+ parameters:
+ colour: Blue
+ unescaped: The colour is ${colour}
+ escaped: The colour is \${colour}
+ double_escaped: The colour is \\${colour}
+
+
+This would produce:
+
+.. code-block:: yaml
+
+ parameters:
+ colour: Blue
+ unescaped: The colour is Blue
+ escaped: The colour is ${colour}
+ double_escaped: The colour is \Blue
+
+
+
+Ignore class not found
+----------------------
+
+In some cases (bootstrapping, development) it can be convenient to ignore some missing classes.
+The feature is controlled by two options:
+
+.. code-block:: yaml
+
+ ignore_class_notfound: False
+ ignore_class_notfound_regexp: ['.*']
+
+If you set the regexp pattern to ``service.*``, all missing classes whose names start with 'service.' will be logged
+with a warning, but reclass will still return the rendered output, provided all parameter interpolation passes.
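+
+For example, a configuration that ignores only missing ``service.*`` classes (using the renamed
+``ignore_class_notfound_regexp`` option from this change set) could look like:
+
+.. code-block:: yaml
+
+ ignore_class_notfound: True
+ ignore_class_notfound_regexp: ['service.*']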
+
+
+
+Merging Referenced Lists and Dictionaries
+-----------------------------------------
+
+Referenced lists or dicts can now be merged:
+
+.. code-block:: yaml
+
+ # nodes/test.yml
+ classes:
+ - test1
+ - test2
+ parameters:
+ one:
+ a: 1
+ b: 2
+ two:
+ c: 3
+ d: 4
+ three:
+ e: 5
+
+ # classes/test1.yml
+ parameters:
+ three: ${one}
+
+ # classes/test2.yml
+ parameters:
+ three: ${two}
+
+Running ``reclass.py --nodeinfo test`` then gives:
+
+.. code-block:: yaml
+
+ parameters:
+ one:
+ a: 1
+ b: 2
+ three:
+ a: 1
+ b: 2
+ c: 3
+ d: 4
+ e: 5
+ two:
+ c: 3
+ d: 4
+
+This first sets the parameter ``three`` to the value of parameter ``one`` (class test1), then merges parameter ``two`` into
+parameter ``three`` (class test2), and finally merges the parameter ``three`` definition given in the node definition into
+the final value.
+
+
+Allow overriding lists and dicts with an empty entity or None instead of merging
+--------------------------------------------------------------------------------
+
+With settings:
+
+.. code-block:: yaml
+
+ allow_none_override: True # default True
+
+ # note: merging a dict or list over None is always allowed and not configurable
+
+Referenced lists or dicts can now be overridden by None or by an empty dict or list:
+
+.. code-block:: yaml
+
+ # nodes/test.yml
+ parameters:
+ one:
+ a: 1
+ b: 2
+ two: {}
+ three: ~
+
+ # classes/test1.yml
+ parameters:
+ one: ${two}
+
+ # classes/test2.yml
+ parameters:
+ three: ${one}
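+
+Assuming the node also lists ``test1`` and ``test2`` in its classes and that ``three`` is set to YAML null,
+``reclass.py --nodeinfo test`` would be expected to report roughly (a sketch, not verified output):
+
+.. code-block:: yaml
+
+  parameters:
+    one:
+      a: 1
+      b: 2
+    two: {}
+    three: null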
+
+
+Nested References
+-----------------
+
+References can now be nested, for example:
+
+.. code-block:: yaml
+
+ # nodes/node1.yml
+ parameters:
+ alpha:
+ one: ${beta:${alpha:two}}
+ two: a
+ beta:
+ a: 99
+
+``reclass.py --nodeinfo node1`` then gives:
+
+.. code-block:: yaml
+
+ parameters:
+ alpha:
+ one: 99
+ two: a
+ beta:
+ a: 99
+
+The ``${beta:${alpha:two}}`` construct first resolves the ``${alpha:two}`` reference to the value 'a', then resolves
+the reference ``${beta:a}`` to the value 99.
+
+
+Ignore overwritten missing references
+-------------------------------------
+
+Given the following classes:
+
+.. code-block:: yaml
+
+ # node1.yml
+ classes:
+ - class1
+ - class2
+ - class3
+
+ # class1.yml
+ parameters:
+ a: ${x}
+
+ # class2.yml
+ parameters:
+ a: ${y}
+
+ # class3.yml
+ parameters:
+ y: 1
+
+
+The parameter ``a`` only depends on the parameter ``y``, through the reference set in class2. The fact that the parameter ``x`` referenced
+in class1 is not defined does not affect the final value of the parameter ``a``. For such overwritten missing references, by default a warning is
+printed but no error is raised, provided the final value of the parameter being evaluated is a scalar. If the final value is a dictionary or list,
+an error is always raised when a reference is missing.
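+
+With these classes, ``reclass.py --nodeinfo node1`` would be expected to give (alongside a warning about the
+unresolvable ``${x}``) something like:
+
+.. code-block:: yaml
+
+  parameters:
+    a: 1
+    y: 1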
+
+The default value of the option is True, which keeps the behavior backward compatible.
+
+.. code-block:: yaml
+
+ ignore_overwritten_missing_reference: True
+
+
+Print summary of missing references
+-----------------------------------
+
+Instead of failing on the first undefined reference error, all missing reference errors are printed at once.
+
+.. code-block:: text
+
+ reclass --nodeinfo mynode
+ -> dontpanic
+ Cannot resolve ${_param:kkk}, at mkkek3:tree:to:fail, in yaml_fs:///test/classes/third.yml
+ Cannot resolve ${_param:kkk}, at mkkek3:tree:another:xxxx, in yaml_fs:///test/classes/third.yml
+ Cannot resolve ${_param:kkk}, at mykey2:tree:to:fail, in yaml_fs:///test/classes/third.yml
+
+To report all missing reference errors together, enable the option:
+
+.. code-block:: yaml
+
+ group_errors: True
+
+
+Inventory Queries
+-----------------
+
+Inventory querying works using a new key type, ``exports``, to hold values which other node definitions can read
+using a ``$[]`` query. For example, with:
+
+.. code-block:: yaml
+
+ # nodes/node1.yml
+ exports:
+ test_zero: 0
+ test_one:
+ name: ${name}
+ value: 6
+ test_two: ${dict}
+
+ parameters:
+ name: node1
+ dict:
+ a: 1
+ b: 2
+ exp_value_test: $[ exports:test_two ]
+ exp_if_test0: $[ if exports:test_zero == 0 ]
+ exp_if_test1: $[ exports:test_one if exports:test_one:value == 7 ]
+ exp_if_test2: $[ exports:test_one if exports:test_one:name == self:name ]
+
+ # nodes/node2.yml
+ exports:
+ test_zero: 0
+ test_one:
+ name: ${name}
+ value: 7
+ test_two: ${dict}
+
+ parameters:
+ name: node2
+ dict:
+ a: 11
+ b: 22
+
+
+Running ``reclass.py --nodeinfo node1`` gives (listing only the exports and parameters):
+
+.. code-block:: yaml
+
+ exports:
+ test_one:
+ name: node1
+ value: 6
+ test_two:
+ a: 1
+ b: 2
+ parameters:
+ dict:
+ a: 1
+ b: 2
+ exp_if_test0:
+ - node1
+ - node2
+ exp_if_test1:
+ node2:
+ name: node2
+ value: 7
+ exp_if_test2:
+ node1:
+ name: node1
+ value: 6
+ exp_value_test:
+ node1:
+ a: 1
+ b: 2
+ node2:
+ a: 11
+ b: 22
+ name: node1
+
+
+Exports defined for a node can be a simple value or a reference to a parameter in the node definition.
+The ``$[]`` inventory queries are calculated for simple value expressions, ``$[ exports:key ]``, by returning
+a dictionary with an element (``{ node_name: key value }``) for each node which defines ``key`` in its exports
+section. For tests with a preceding value, ``$[ exports:key if exports:test_key == test_value ]``, the
+element (``{ node_name: key value }``) is only added to the returned dictionary if the ``test_key`` defined in
+the node's exports section equals the test value. For tests without a preceding value,
+``$[ if exports:test_key == test_value ]``, a list of the nodes which pass the test is returned. In either test
+form the test value can be a simple value or a node parameter, and in addition to the equality test
+a not-equals test (``!=``) can also be used.
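+
+For example, a not-equals form of the tests above (reusing the keys from the earlier example) could be written as:
+
+.. code-block:: yaml
+
+  exp_if_test3: $[ exports:test_one if exports:test_one:name != self:name ]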
+
+
+**Inventory query options**
+
+By default, inventory queries only look at nodes in the same environment as the querying node. This can be
+overridden using the ``+AllEnvs`` option:
+
+.. code-block:: yaml
+
+ $[ +AllEnvs exports:test ]
+
+Any errors in rendering the export parameters for a node will cause an error for the inventory query as a whole.
+This can be overridden using the ``+IgnoreErrors`` option:
+
+.. code-block:: yaml
+
+ $[ +IgnoreErrors exports:test ]
+
+With the ``+IgnoreErrors`` option nodes which generate an error evaluating ``exports:test`` will be ignored.
+
+Inventory query options can be combined:
+
+.. code-block:: yaml
+
+ $[ +AllEnvs +IgnoreErrors exports:test ]
+
+**Logical operators and/or**
+
+The logical operators and/or can be used in inventory queries:
+
+.. code-block:: yaml
+
+ $[ exports:test_value if exports:test_zero == 0 and exports:test_one == self:value ]
+
+The individual elements of the if statement are evaluated and combined with the logical operators starting from the
+left and working to the right.
+
+
+**Inventory query example**
+
+A cluster of machines can be defined using an inventory query, for example to open access on a database server to a
+group of nodes. Given exports/parameters for nodes of the form:
+
+.. code-block:: yaml
+
+ # for all nodes requiring access to the database server
+ exports:
+ host:
+ ip_address: aaa.bbb.ccc.ddd
+ cluster: _some_cluster_name_
+
+.. code-block:: yaml
+
+ # for the database server
+ parameters:
+ cluster_name: production-cluster
+ postgresql:
+ server:
+ clients: $[ exports:host:ip_address if exports:cluster == self:cluster_name ]
+
+This will generate a dictionary with an entry for each node where the ``exports:cluster`` key is equal to the
+``parameters:cluster_name`` key of the node on which the inventory query is run. Each entry in the generated dictionary
+will contain the value of the ``exports:host:ip_address`` key. The output dictionary (depending on the node definitions)
+would look like:
+
+.. code-block:: yaml
+
+ node1:
+ ip_address: aaa.bbb.ccc.ddd
+ node2:
+ ip_address: www.xxx.yyy.zzz
+
+For nodes where the ``exports:cluster`` key is not defined, or where it is not equal to ``self:cluster_name``, no entry is made
+in the output dictionary.
+
+In practice the ``exports:cluster`` key can be set using a parameter reference:
+
+.. code-block:: yaml
+
+ exports:
+ cluster: ${cluster_name}
+ parameters:
+ cluster_name: production-cluster
+
+The above exports and parameter definitions could be put into a separate class and then included both by the nodes which
+require access to the database and by the database server itself.
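+
+A minimal sketch of such a shared class (the class name and the ``ip_address`` parameter are hypothetical):
+
+.. code-block:: yaml
+
+  # classes/database-access.yml
+  exports:
+    host:
+      ip_address: ${ip_address}
+    cluster: ${cluster_name}
+  parameters:
+    cluster_name: production-cluster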
diff --git a/README.rst b/README.rst
index e88c135..99de5f6 100644
--- a/README.rst
+++ b/README.rst
@@ -1,5 +1,26 @@
-reclass README
-==============
+Reclass README
+=========================
-The documentation for **reclass** is available from
-http://reclass.pantsfullofunix.net.
+This is a fork of the original **reclass**, which is available at:
+https://github.com/madduck/reclass
+
+Extensions
+==========
+
+List of the core features:
+
+* Escaping of References and Inventory Queries
+* Merging Referenced Lists and Dictionaries
+* Nested References
+* Inventory Queries
+* Ignore missing classes (with a regexp option)
+
+
+.. include:: ./README-extentions.rst
+
+
+Documentation
+=============
+
+Documentation covering the original version is in the doc directory.
+See the README-extentions.rst file for documentation on the extensions.
diff --git a/doc/source/conf.py b/doc/source/conf.py
index 422128e..6ce7f02 100644
--- a/doc/source/conf.py
+++ b/doc/source/conf.py
@@ -16,7 +16,7 @@
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
-#sys.path.insert(0, os.path.abspath('.'))
+sys.path.insert(0, os.path.abspath('../../'))
# -- General configuration -----------------------------------------------------
diff --git a/reclass/__init__.py b/reclass/__init__.py
index 7cd6c30..adb421e 100644
--- a/reclass/__init__.py
+++ b/reclass/__init__.py
@@ -7,16 +7,18 @@
# Released under the terms of the Artistic Licence 2.0
#
-from output import OutputLoader
-from storage.loader import StorageBackendLoader
-from storage.memcache_proxy import MemcacheProxy
+from reclass.output import OutputLoader
+from reclass.storage.loader import StorageBackendLoader
+from reclass.storage.memcache_proxy import MemcacheProxy
def get_storage(storage_type, nodes_uri, classes_uri, **kwargs):
storage_class = StorageBackendLoader(storage_type).load()
return MemcacheProxy(storage_class(nodes_uri, classes_uri, **kwargs))
+def get_path_mangler(storage_type,**kwargs):
+ return StorageBackendLoader(storage_type).path_mangler()
-def output(data, fmt, pretty_print=False):
+def output(data, fmt, pretty_print=False, no_refs=False):
output_class = OutputLoader(fmt).load()
outputter = output_class()
- return outputter.dump(data, pretty_print=pretty_print)
+ return outputter.dump(data, pretty_print=pretty_print, no_refs=no_refs)
diff --git a/reclass/adapters/ansible.py b/reclass/adapters/ansible.py
index cbf5f17..1887245 100755
--- a/reclass/adapters/ansible.py
+++ b/reclass/adapters/ansible.py
@@ -10,6 +10,9 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
+# 2017.08.08 Andrew Pickford <anpickford@googlemail.com>
+# The ansible adapter has received little testing and may not work at all now.
+
import os, sys, posix, optparse
@@ -19,6 +22,7 @@
from reclass.config import find_and_read_configfile, get_options
from reclass.version import *
from reclass.constants import MODE_NODEINFO
+from reclass.settings import Settings
def cli():
try:
@@ -27,6 +31,7 @@
ansible_dir = os.path.abspath(os.path.dirname(sys.argv[0]))
defaults = {'inventory_base_uri': ansible_dir,
+ 'no_refs' : False,
'pretty_print' : True,
'output' : 'json',
'applications_postfix': '_hosts'
@@ -54,10 +59,11 @@
add_options_cb=add_ansible_options_group,
defaults=defaults)
- storage = get_storage(options.storage_type, options.nodes_uri,
- options.classes_uri)
+ storage = get_storage(options.storage_type, options.nodes_uri, options.classes_uri)
class_mappings = defaults.get('class_mappings')
- reclass = Core(storage, class_mappings)
+ defaults.update(vars(options))
+ settings = Settings(defaults)
+ reclass = Core(storage, class_mappings, settings)
if options.mode == MODE_NODEINFO:
data = reclass.nodeinfo(options.hostname)
@@ -81,7 +87,7 @@
data = groups
- print output(data, options.output, options.pretty_print)
+ print output(data, options.output, options.pretty_print, options.no_refs)
except ReclassException, e:
e.exit_with_message(sys.stderr)
diff --git a/reclass/adapters/salt.py b/reclass/adapters/salt.py
index d382632..54adf5a 100755
--- a/reclass/adapters/salt.py
+++ b/reclass/adapters/salt.py
@@ -9,13 +9,13 @@
import os, sys, posix
-from reclass import get_storage, output
+from reclass import get_storage, output, get_path_mangler
from reclass.core import Core
from reclass.errors import ReclassException
-from reclass.config import find_and_read_configfile, get_options, \
- path_mangler
+from reclass.config import find_and_read_configfile, get_options
from reclass.constants import MODE_NODEINFO
from reclass.defaults import *
+from reclass.settings import Settings
from reclass.version import *
def ext_pillar(minion_id, pillar,
@@ -25,19 +25,16 @@
classes_uri=OPT_CLASSES_URI,
class_mappings=None,
propagate_pillar_data_to_reclass=False,
- ignore_class_notfound=OPT_IGNORE_CLASS_NOTFOUND,
- ignore_class_regexp=OPT_IGNORE_CLASS_REGEXP):
+ **kwargs):
- nodes_uri, classes_uri = path_mangler(inventory_base_uri,
- nodes_uri, classes_uri)
- storage = get_storage(storage_type, nodes_uri, classes_uri,
- default_environment='base')
+ path_mangler = get_path_mangler(storage_type)
+ nodes_uri, classes_uri = path_mangler(inventory_base_uri, nodes_uri, classes_uri)
+ storage = get_storage(storage_type, nodes_uri, classes_uri)
input_data = None
if propagate_pillar_data_to_reclass:
input_data = pillar
- reclass = Core(storage, class_mappings, input_data=input_data,
- ignore_class_notfound=ignore_class_notfound,
- ignore_class_regexp=ignore_class_regexp)
+ settings = Settings(kwargs)
+ reclass = Core(storage, class_mappings, settings, input_data=input_data)
data = reclass.nodeinfo(minion_id)
params = data.get('parameters', {})
@@ -51,18 +48,13 @@
def top(minion_id, storage_type=OPT_STORAGE_TYPE,
inventory_base_uri=OPT_INVENTORY_BASE_URI, nodes_uri=OPT_NODES_URI,
- classes_uri=OPT_CLASSES_URI,
- class_mappings=None,
- ignore_class_notfound=OPT_IGNORE_CLASS_NOTFOUND,
- ignore_class_regexp=OPT_IGNORE_CLASS_REGEXP):
+ classes_uri=OPT_CLASSES_URI, class_mappings=None, **kwargs):
- nodes_uri, classes_uri = path_mangler(inventory_base_uri,
- nodes_uri, classes_uri)
- storage = get_storage(storage_type, nodes_uri, classes_uri,
- default_environment='base')
- reclass = Core(storage, class_mappings, input_data=None,
- ignore_class_notfound=ignore_class_notfound,
- ignore_class_regexp=ignore_class_regexp)
+ path_mangler = get_path_mangler(storage_type)
+ nodes_uri, classes_uri = path_mangler(inventory_base_uri, nodes_uri, classes_uri)
+ storage = get_storage(storage_type, nodes_uri, classes_uri)
+ settings = Settings(kwargs)
+ reclass = Core(storage, class_mappings, settings, input_data=None)
# if the minion_id is not None, then return just the applications for the
# specific minion, otherwise return the entire top data (which we need for
@@ -89,6 +81,7 @@
try:
inventory_dir = os.path.abspath(os.path.dirname(sys.argv[0]))
defaults = {'pretty_print' : True,
+ 'no_refs' : False,
'output' : 'yaml',
'inventory_base_uri': inventory_dir
}
@@ -103,6 +96,12 @@
nodeinfo_help='output pillar data for a specific node',
defaults=defaults)
class_mappings = defaults.get('class_mappings')
+ defaults.update(vars(options))
+ defaults.pop("storage_type", None)
+ defaults.pop("inventory_base_uri", None)
+ defaults.pop("nodes_uri", None)
+ defaults.pop("classes_uri", None)
+ defaults.pop("class_mappings", None)
if options.mode == MODE_NODEINFO:
data = ext_pillar(options.nodename, {},
@@ -111,8 +110,7 @@
nodes_uri=options.nodes_uri,
classes_uri=options.classes_uri,
class_mappings=class_mappings,
- ignore_class_notfound=options.ignore_class_notfound,
- ignore_class_regexp=options.ignore_class_regexp)
+ **defaults)
else:
data = top(minion_id=None,
storage_type=options.storage_type,
@@ -120,10 +118,9 @@
nodes_uri=options.nodes_uri,
classes_uri=options.classes_uri,
class_mappings=class_mappings,
- ignore_class_notfound=options.ignore_class_notfound,
- ignore_class_regexp=options.ignore_class_regexp)
+ **defaults)
- print output(data, options.output, options.pretty_print)
+ print output(data, options.output, options.pretty_print, options.no_refs)
except ReclassException, e:
e.exit_with_message(sys.stderr)
diff --git a/reclass/cli.py b/reclass/cli.py
index d7e22b7..d1b22b8 100644
--- a/reclass/cli.py
+++ b/reclass/cli.py
@@ -11,33 +11,34 @@
from reclass import get_storage, output
from reclass.core import Core
+from reclass.settings import Settings
from reclass.config import find_and_read_configfile, get_options
-from reclass.errors import ReclassException
from reclass.defaults import *
+from reclass.errors import ReclassException
from reclass.constants import MODE_NODEINFO
from reclass.version import *
def main():
try:
- defaults = {'pretty_print' : OPT_PRETTY_PRINT,
+ defaults = {'no_refs' : OPT_NO_REFS,
+ 'pretty_print' : OPT_PRETTY_PRINT,
'output' : OPT_OUTPUT
}
defaults.update(find_and_read_configfile())
- options = get_options(RECLASS_NAME, VERSION, DESCRIPTION,
- defaults=defaults)
- storage = get_storage(options.storage_type, options.nodes_uri,
- options.classes_uri, default_environment='base')
+ options = get_options(RECLASS_NAME, VERSION, DESCRIPTION, defaults=defaults)
+ storage = get_storage(options.storage_type, options.nodes_uri, options.classes_uri)
class_mappings = defaults.get('class_mappings')
- reclass = Core(storage, class_mappings, ignore_class_notfound=options.ignore_class_notfound, ignore_class_regexp=options.ignore_class_regexp)
+ defaults.update(vars(options))
+ settings = Settings(defaults)
+ reclass = Core(storage, class_mappings, settings)
if options.mode == MODE_NODEINFO:
data = reclass.nodeinfo(options.nodename)
-
else:
data = reclass.inventory()
- print output(data, options.output, options.pretty_print)
+ print output(data, options.output, options.pretty_print, options.no_refs)
except ReclassException, e:
e.exit_with_message(sys.stderr)
diff --git a/reclass/config.py b/reclass/config.py
index 36fbf5a..e9bb43b 100644
--- a/reclass/config.py
+++ b/reclass/config.py
@@ -12,6 +12,7 @@
import errors
from defaults import *
from constants import MODE_NODEINFO, MODE_INVENTORY
+from reclass import get_path_mangler
def make_db_options_group(parser, defaults={}):
ret = optparse.OptionGroup(parser, 'Database options',
@@ -20,21 +21,20 @@
default=defaults.get('storage_type', OPT_STORAGE_TYPE),
help='the type of storage backend to use [%default]')
ret.add_option('-b', '--inventory-base-uri', dest='inventory_base_uri',
- default=defaults.get('inventory_base_uri',
- OPT_INVENTORY_BASE_URI),
+ default=defaults.get('inventory_base_uri', OPT_INVENTORY_BASE_URI),
help='the base URI to prepend to nodes and classes [%default]'),
ret.add_option('-u', '--nodes-uri', dest='nodes_uri',
default=defaults.get('nodes_uri', OPT_NODES_URI),
help='the URI to the nodes storage [%default]'),
ret.add_option('-c', '--classes-uri', dest='classes_uri',
default=defaults.get('classes_uri', OPT_CLASSES_URI),
- help='the URI to the classes storage [%default]')
+ help='the URI to the classes storage [%default]'),
ret.add_option('-z', '--ignore-class-notfound', dest='ignore_class_notfound',
default=defaults.get('ignore_class_notfound', OPT_IGNORE_CLASS_NOTFOUND),
help='decision for not found classes [%default]')
- ret.add_option('-x', '--ignore-class-regexp', dest='ignore_class_regexp',
- default=defaults.get('ignore_class_regexp', OPT_IGNORE_CLASS_REGEXP),
- help='decision for not found classes [%default]')
+ ret.add_option('-x', '--ignore-class-notfound-regexp', dest='ignore_class_notfound_regexp',
+ default=defaults.get('ignore_class_notfound_regexp', OPT_IGNORE_CLASS_NOTFOUND_REGEXP),
+ help='regexp for not found classes [%default]')
return ret
@@ -44,10 +44,17 @@
ret.add_option('-o', '--output', dest='output',
default=defaults.get('output', OPT_OUTPUT),
help='output format (yaml or json) [%default]')
- ret.add_option('-y', '--pretty-print', dest='pretty_print',
- action="store_true",
+ ret.add_option('-y', '--pretty-print', dest='pretty_print', action="store_true",
default=defaults.get('pretty_print', OPT_PRETTY_PRINT),
help='try to make the output prettier [%default]')
+ ret.add_option('-r', '--no-refs', dest='no_refs', action="store_true",
+ default=defaults.get('no_refs', OPT_NO_REFS),
+ help='output all key values; do not use yaml references [%default]')
+ ret.add_option('-1', '--single-error', dest='group_errors', action="store_false",
+ default=defaults.get('group_errors', OPT_GROUP_ERRORS),
+ help='throw errors immediately instead of grouping them together')
+ ret.add_option('-0', '--multiple-errors', dest='group_errors', action="store_true",
+ help='where possible report any errors encountered as a group')
return ret
@@ -134,30 +141,6 @@
return parser, option_checker
-def path_mangler(inventory_base_uri, nodes_uri, classes_uri):
-
- if inventory_base_uri is None:
- # if inventory_base is not given, default to current directory
- inventory_base_uri = os.getcwd()
-
- nodes_uri = nodes_uri or 'nodes'
- classes_uri = classes_uri or 'classes'
-
- def _path_mangler_inner(path):
- ret = os.path.join(inventory_base_uri, path)
- ret = os.path.expanduser(ret)
- return os.path.abspath(ret)
-
- n, c = map(_path_mangler_inner, (nodes_uri, classes_uri))
- if n == c:
- raise errors.DuplicateUriError(n, c)
- common = os.path.commonprefix((n, c))
- if common == n or common == c:
- raise errors.UriOverlapError(n, c)
-
- return n, c
-
-
def get_options(name, version, description,
inventory_shortopt='-i',
inventory_longopt='--inventory',
@@ -181,9 +164,8 @@
options, args = parser.parse_args()
checker(options, args)
- options.nodes_uri, options.classes_uri = \
- path_mangler(options.inventory_base_uri, options.nodes_uri,
- options.classes_uri)
+ path_mangler = get_path_mangler(options.storage_type)
+ options.nodes_uri, options.classes_uri = path_mangler(options.inventory_base_uri, options.nodes_uri, options.classes_uri)
return options
diff --git a/reclass/core.py b/reclass/core.py
index 32dee20..9da0ddb 100644
--- a/reclass/core.py
+++ b/reclass/core.py
@@ -7,28 +7,28 @@
# Released under the terms of the Artistic Licence 2.0
#
+import copy
import time
-#import types
import re
-import sys
import fnmatch
import shlex
-from reclass.datatypes import Entity, Classes, Parameters
-from reclass.errors import MappingFormatError, ClassNotFound
+import string
+import sys
+import yaml
+from reclass.settings import Settings
+from reclass.output.yaml_outputter import ExplicitDumper
+from reclass.datatypes import Entity, Classes, Parameters, Exports
+from reclass.errors import MappingFormatError, ClassNotFound, InvQueryClassNotFound, InvQueryError, InterpolationError
class Core(object):
- def __init__(self, storage, class_mappings, input_data=None,
- ignore_class_notfound=False, ignore_class_regexp=['.*']):
+ def __init__(self, storage, class_mappings, settings, input_data=None):
self._storage = storage
self._class_mappings = class_mappings
- self._ignore_class_notfound = ignore_class_notfound
+ self._settings = settings
self._input_data = input_data
-
- if isinstance(ignore_class_regexp, basestring):
- self._ignore_class_regexp = [ignore_class_regexp]
- else:
- self._ignore_class_regexp = ignore_class_regexp
+ if self._settings.ignore_class_notfound:
+ self._cnf_r = re.compile('|'.join([x for x in self._settings.ignore_class_notfound_regexp]))
@staticmethod
def _get_timestamp():
@@ -63,7 +63,7 @@
def _get_class_mappings_entity(self, nodename):
if not self._class_mappings:
- return Entity(name='empty (class mappings)')
+ return Entity(self._settings, name='empty (class mappings)')
c = Classes()
for mapping in self._class_mappings:
matched = False
@@ -79,40 +79,42 @@
for klass in klasses:
c.append_if_new(klass)
- return Entity(classes=c,
+ return Entity(self._settings, classes=c,
name='class mappings for node {0}'.format(nodename))
def _get_input_data_entity(self):
if not self._input_data:
- return Entity(name='empty (input data)')
- p = Parameters(self._input_data)
- return Entity(parameters=p, name='input data')
+ return Entity(self._settings, name='empty (input data)')
+ p = Parameters(self._input_data, self._settings)
+ return Entity(self._settings, parameters=p, name='input data')
- def _recurse_entity(self, entity, merge_base=None, seen=None, nodename=None):
+ def _recurse_entity(self, entity, merge_base=None, seen=None, nodename=None, environment=None):
if seen is None:
seen = {}
- if merge_base is None:
- merge_base = Entity(name='empty (@{0})'.format(nodename))
+ if environment is None:
+ environment = self._settings.default_environment
- cnf_r = None # class_notfound_regexp compiled
+ if merge_base is None:
+ merge_base = Entity(self._settings, name='empty (@{0})'.format(nodename))
+
for klass in entity.classes.as_list():
if klass not in seen:
try:
- class_entity = self._storage.get_class(klass)
- except ClassNotFound, e:
- if self._ignore_class_notfound:
- if not cnf_r:
- cnf_r = re.compile('|'.join(self._ignore_class_regexp))
- if cnf_r.match(klass):
- # TODO, add logging handler
- print >>sys.stderr, "[WARNING] Reclass class not found: '%s'. Skipped!" % klass
+ class_entity = self._storage.get_class(klass, environment, self._settings)
+ except ClassNotFound as e:
+ if self._settings.ignore_class_notfound:
+ if self._cnf_r.match(klass):
+ if self._settings.ignore_class_notfound_warning:
+ # TODO, add logging handler
+ print >>sys.stderr, "[WARNING] Reclass class not found: '%s'. Skipped!" % klass
continue
- e.set_nodename(nodename)
- raise e
+ e.nodename = nodename
+ e.uri = entity.uri
+ raise
descent = self._recurse_entity(class_entity, seen=seen,
- nodename=nodename)
+ nodename=nodename, environment=environment)
# on every iteration, we merge the result of the recursive
# descent into what we have so far…
merge_base.merge(descent)
@@ -124,18 +126,72 @@
merge_base.merge(entity)
return merge_base
- def _nodeinfo(self, nodename):
- node_entity = self._storage.get_node(nodename)
- base_entity = Entity(name='base')
+ def _get_automatic_parameters(self, nodename, environment):
+ if self._settings.automatic_parameters:
+ return Parameters({ '_reclass_': { 'name': { 'full': nodename, 'short': string.split(nodename, '.')[0] },
+ 'environment': environment } }, self._settings, '__auto__')
+ else:
+ return Parameters({}, self._settings, '')
+
+ def _get_inventory(self, all_envs, environment, queries):
+ inventory = {}
+ for nodename in self._storage.enumerate_nodes():
+ try:
+ node_base = self._storage.get_node(nodename, self._settings)
+ if node_base.environment == None:
+ node_base.environment = self._settings.default_environment
+ except yaml.scanner.ScannerError as e:
+ if self._settings.inventory_ignore_failed_node:
+ continue
+ else:
+ raise
+
+ if all_envs or node_base.environment == environment:
+ try:
+ node = self._node_entity(nodename)
+ except ClassNotFound as e:
+ raise InvQueryClassNotFound(e)
+ if queries is None:
+ try:
+ node.interpolate_exports()
+ except InterpolationError as e:
+ e.nodename = nodename
+ else:
+ node.initialise_interpolation()
+ for p, q in queries:
+ try:
+ node.interpolate_single_export(q)
+ except InterpolationError as e:
+ e.nodename = nodename
+ raise InvQueryError(q.contents(), e, context=p, uri=q.uri())
+ inventory[nodename] = node.exports.as_dict()
+ return inventory
+
+ def _node_entity(self, nodename):
+ node_entity = self._storage.get_node(nodename, self._settings)
+ if node_entity.environment == None:
+ node_entity.environment = self._settings.default_environment
+ base_entity = Entity(self._settings, name='base')
base_entity.merge(self._get_class_mappings_entity(node_entity.name))
base_entity.merge(self._get_input_data_entity())
+ base_entity.merge_parameters(self._get_automatic_parameters(nodename, node_entity.environment))
seen = {}
- merge_base = self._recurse_entity(base_entity, seen=seen,
- nodename=base_entity.name)
- ret = self._recurse_entity(node_entity, merge_base, seen=seen,
- nodename=node_entity.name)
- ret.interpolate()
- return ret
+ merge_base = self._recurse_entity(base_entity, seen=seen, nodename=nodename,
+ environment=node_entity.environment)
+ return self._recurse_entity(node_entity, merge_base, seen=seen, nodename=nodename,
+ environment=node_entity.environment)
+
+ def _nodeinfo(self, nodename, inventory):
+ try:
+ node = self._node_entity(nodename)
+ node.initialise_interpolation()
+ if node.parameters.has_inv_query() and inventory is None:
+ inventory = self._get_inventory(node.parameters.needs_all_envs(), node.environment, node.parameters.get_inv_queries())
+ node.interpolate(inventory)
+ return node
+ except InterpolationError as e:
+ e.nodename = nodename
+ raise
def _nodeinfo_as_dict(self, nodename, entity):
ret = {'__reclass__' : {'node': entity.name, 'name': nodename,
@@ -148,12 +204,18 @@
return ret
def nodeinfo(self, nodename):
- return self._nodeinfo_as_dict(nodename, self._nodeinfo(nodename))
+ return self._nodeinfo_as_dict(nodename, self._nodeinfo(nodename, None))
def inventory(self):
+ query_nodes = set()
entities = {}
+ inventory = self._get_inventory(True, '', None)
for n in self._storage.enumerate_nodes():
- entities[n] = self._nodeinfo(n)
+ entities[n] = self._nodeinfo(n, inventory)
+ if entities[n].parameters.has_inv_query():
+ query_nodes.add(n)
+ for n in query_nodes:
+ entities[n] = self._nodeinfo(n, inventory)
nodes = {}
applications = {}
diff --git a/reclass/datatypes/__init__.py b/reclass/datatypes/__init__.py
index 20f7551..48c4a8b 100644
--- a/reclass/datatypes/__init__.py
+++ b/reclass/datatypes/__init__.py
@@ -6,7 +6,8 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
-from applications import Applications
-from classes import Classes
-from entity import Entity
-from parameters import Parameters
+from .applications import Applications
+from .classes import Classes
+from .entity import Entity
+from .exports import Exports
+from .parameters import Parameters
diff --git a/reclass/datatypes/applications.py b/reclass/datatypes/applications.py
index d024e97..3c7afce 100644
--- a/reclass/datatypes/applications.py
+++ b/reclass/datatypes/applications.py
@@ -7,7 +7,7 @@
# Released under the terms of the Artistic Licence 2.0
#
-from classes import Classes
+from .classes import Classes
class Applications(Classes):
'''
diff --git a/reclass/datatypes/entity.py b/reclass/datatypes/entity.py
index 573a28c..b43ac72 100644
--- a/reclass/datatypes/entity.py
+++ b/reclass/datatypes/entity.py
@@ -6,9 +6,10 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
-from classes import Classes
-from applications import Applications
-from parameters import Parameters
+from .classes import Classes
+from .applications import Applications
+from .exports import Exports
+from .parameters import Parameters
class Entity(object):
'''
@@ -16,24 +17,35 @@
for merging. The name and uri of an Entity will be updated to the name and
uri of the Entity that is being merged.
'''
- def __init__(self, classes=None, applications=None, parameters=None,
- uri=None, name=None, environment=None):
+ def __init__(self, settings, classes=None, applications=None, parameters=None,
+ exports=None, uri=None, name=None, environment=None):
+ self._uri = uri or ''
+ self._name = name or ''
if classes is None: classes = Classes()
self._set_classes(classes)
if applications is None: applications = Applications()
self._set_applications(applications)
- if parameters is None: parameters = Parameters()
+ if parameters is None: parameters = Parameters(None, settings, uri)
+ if exports is None: exports = Exports(None, settings, uri)
self._set_parameters(parameters)
- self._uri = uri or ''
- self._name = name or ''
- self._environment = environment or ''
+ self._set_exports(exports)
+ self._environment = environment
name = property(lambda s: s._name)
+ short_name = property(lambda s: s._short_name)
uri = property(lambda s: s._uri)
- environment = property(lambda s: s._environment)
classes = property(lambda s: s._classes)
applications = property(lambda s: s._applications)
parameters = property(lambda s: s._parameters)
+ exports = property(lambda s: s._exports)
+
+ @property
+ def environment(self):
+ return self._environment
+
+ @environment.setter
+ def environment(self, value):
+ self._environment = value
def _set_classes(self, classes):
if not isinstance(classes, Classes):
@@ -53,22 +65,46 @@
'instance of type %s' % type(parameters))
self._parameters = parameters
+ def _set_exports(self, exports):
+ if not isinstance(exports, Exports):
+ raise TypeError('Entity.exports cannot be set to '\
+ 'instance of type %s' % type(exports))
+ self._exports = exports
+
def merge(self, other):
self._classes.merge_unique(other._classes)
self._applications.merge_unique(other._applications)
self._parameters.merge(other._parameters)
+ self._exports.merge(other._exports)
self._name = other.name
self._uri = other.uri
- self._environment = other.environment
+ if other.environment is not None:
+ self._environment = other.environment
- def interpolate(self):
- self._parameters.interpolate()
+ def merge_parameters(self, params):
+ self._parameters.merge(params)
+
+ def interpolate(self, inventory):
+ self._parameters.interpolate(inventory)
+ self.interpolate_exports()
+
+ def initialise_interpolation(self):
+ self._parameters.initialise_interpolation()
+ self._exports.initialise_interpolation()
+
+ def interpolate_exports(self):
+ self.initialise_interpolation()
+ self._exports.interpolate_from_external(self._parameters)
+
+ def interpolate_single_export(self, references):
+ self._exports.interpolate_single_from_external(self._parameters, references)
def __eq__(self, other):
return isinstance(other, type(self)) \
and self._applications == other._applications \
and self._classes == other._classes \
and self._parameters == other._parameters \
+ and self._exports == other._exports \
and self._name == other._name \
and self._uri == other._uri
@@ -76,16 +112,15 @@
return not self.__eq__(other)
def __repr__(self):
- return "%s(%r, %r, %r, uri=%r, name=%r)" % (self.__class__.__name__,
- self.classes,
- self.applications,
- self.parameters,
- self.uri,
- self.name)
+ return "%s(%r, %r, %r, %r, uri=%r, name=%r, environment=%r)" % (
+ self.__class__.__name__, self.classes, self.applications,
+ self.parameters, self.exports, self.uri, self.name,
+ self.environment)
def as_dict(self):
return {'classes': self._classes.as_list(),
'applications': self._applications.as_list(),
'parameters': self._parameters.as_dict(),
+ 'exports': self._exports.as_dict(),
'environment': self._environment
}
diff --git a/reclass/datatypes/exports.py b/reclass/datatypes/exports.py
new file mode 100644
index 0000000..62ea03f
--- /dev/null
+++ b/reclass/datatypes/exports.py
@@ -0,0 +1,92 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass (http://github.com/madduck/reclass)
+#
+
+import copy
+
+from .parameters import Parameters
+from reclass.errors import ResolveError
+from reclass.values.value import Value
+from reclass.values.valuelist import ValueList
+from reclass.utils.dictpath import DictPath
+
+class Exports(Parameters):
+
+ def __init__(self, mapping, settings, uri):
+ super(Exports, self).__init__(mapping, settings, uri)
+
+ def __repr__(self):
+ return '%s(%r)' % (self.__class__.__name__, self._base)
+
+ def delete_key(self, key):
+ self._base.pop(key, None)
+ self._unrendered.pop(key, None)
+
+ def overwrite(self, other):
+ overdict = {'~' + key: value for key, value in other.iteritems()}
+ self.merge(overdict)
+
+ def interpolate_from_external(self, external):
+ while len(self._unrendered) > 0:
+ path, v = self._unrendered.iteritems().next()
+ value = path.get_value(self._base)
+ if isinstance(value, (Value, ValueList)):
+ external._interpolate_references(path, value, None)
+ new = self._interpolate_render_from_external(external._base, path, value)
+ path.set_value(self._base, new)
+ del self._unrendered[path]
+ else:
+ # references to lists and dicts are only deepcopied when merged
+ # together so it's possible a value with references in a referenced
+ # list or dict has already been rendered
+ del self._unrendered[path]
+
+ def interpolate_single_from_external(self, external, query):
+ for r in query.get_inv_references():
+ self._interpolate_single_path_from_external(r, external, query)
+
+ def _interpolate_single_path_from_external(self, mainpath, external, query):
+ required = self._get_required_paths(mainpath)
+ while len(required) > 0:
+ while len(required) > 0:
+ path, v = required.iteritems().next()
+ value = path.get_value(self._base)
+ if isinstance(value, (Value, ValueList)):
+ try:
+ external._interpolate_references(path, value, None)
+ new = self._interpolate_render_from_external(external._base, path, value)
+ path.set_value(self._base, new)
+ except ResolveError as e:
+ if query.ignore_failed_render():
+ path.delete(self._base)
+ else:
+ raise
+ del required[path]
+ del self._unrendered[path]
+ required = self._get_required_paths(mainpath)
+
+ def _get_required_paths(self, mainpath):
+ paths = {}
+ path = DictPath(self._settings.delimiter)
+ for i in mainpath.key_parts():
+ path.add_subpath(i)
+ if path in self._unrendered:
+ paths[path] = True
+ for i in self._unrendered:
+ if mainpath.is_ancestor_of(i) or mainpath == i:
+ paths[i] = True
+ return paths
+
+ def _interpolate_render_from_external(self, context, path, value):
+ try:
+ new = value.render(context, None)
+ except ResolveError as e:
+ e.context = path
+ raise
+ if isinstance(new, dict):
+ self._render_simple_dict(new, path)
+ elif isinstance(new, list):
+ self._render_simple_list(new, path)
+ return new
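`Exports.overwrite` above prefixes every key with the dict key override marker (`~` by default) so that `merge()` replaces existing values instead of merging into them. A simplified Python 3 sketch of that merge rule (the real code wraps values and tracks dictionary paths; `merge_dicts` here is a hypothetical helper):

```python
OVERRIDE_PREFIX = '~'  # reclass's default dict_key_override_prefix

def merge_dicts(cur, new):
    """Recursively merge new into cur; a '~key' replaces instead of merging."""
    for key, value in new.items():
        if key.startswith(OVERRIDE_PREFIX):
            cur[key[len(OVERRIDE_PREFIX):]] = value   # hard overwrite
        elif isinstance(value, dict) and isinstance(cur.get(key), dict):
            merge_dicts(cur[key], value)              # deep merge
        else:
            cur[key] = value
    return cur
```

So `overwrite({'a': ...})` becomes a merge of `{'~a': ...}`, discarding whatever was previously stored under `a` rather than deep-merging into it.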
diff --git a/reclass/datatypes/parameters.py b/reclass/datatypes/parameters.py
index a39324e..ac15925 100644
--- a/reclass/datatypes/parameters.py
+++ b/reclass/datatypes/parameters.py
@@ -6,12 +6,15 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
+
+import copy
+import sys
import types
-from reclass.defaults import PARAMETER_INTERPOLATION_DELIMITER,\
- PARAMETER_DICT_KEY_OVERRIDE_PREFIX
+from collections import namedtuple
from reclass.utils.dictpath import DictPath
-from reclass.utils.refvalue import RefValue
-from reclass.errors import InfiniteRecursionError, UndefinedVariableError
+from reclass.values.value import Value
+from reclass.values.valuelist import ValueList
+from reclass.errors import InfiniteRecursionError, ResolveError, ResolveErrorList, InterpolationError, ParseError, BadReferencesError
class Parameters(object):
'''
@@ -36,91 +39,95 @@
To support these specialities, this class only exposes very limited
functionality and does not try to be a really mapping object.
'''
- DEFAULT_PATH_DELIMITER = PARAMETER_INTERPOLATION_DELIMITER
- DICT_KEY_OVERRIDE_PREFIX = PARAMETER_DICT_KEY_OVERRIDE_PREFIX
- def __init__(self, mapping=None, delimiter=None):
- if delimiter is None:
- delimiter = Parameters.DEFAULT_PATH_DELIMITER
- self._delimiter = delimiter
+ def __init__(self, mapping, settings, uri, merge_initialise = True):
+ self._settings = settings
self._base = {}
- self._occurrences = {}
+ self._uri = uri
+ self._unrendered = None
+ self._escapes_handled = {}
+ self._inv_queries = []
+ self._resolve_errors = ResolveErrorList()
+ self._needs_all_envs = False
+ self._keep_overrides = False
if mapping is not None:
- # we initialise by merging, otherwise the list of references might
- # not be updated
- self.merge(mapping, initmerge=True)
+ if merge_initialise:
+ # we initialise by merging
+ self._keep_overrides = True
+ self.merge(mapping)
+ self._keep_overrides = False
+ else:
+ self._base = copy.deepcopy(mapping)
- delimiter = property(lambda self: self._delimiter)
+ #delimiter = property(lambda self: self._delimiter)
def __len__(self):
return len(self._base)
def __repr__(self):
- return '%s(%r, %r)' % (self.__class__.__name__, self._base,
- self.delimiter)
+ return '%s(%r)' % (self.__class__.__name__, self._base)
def __eq__(self, other):
return isinstance(other, type(self)) \
and self._base == other._base \
- and self._delimiter == other._delimiter
+ and self._settings == other._settings
def __ne__(self, other):
return not self.__eq__(other)
+ def has_inv_query(self):
+ return len(self._inv_queries) > 0
+
+ def get_inv_queries(self):
+ return self._inv_queries
+
+ def needs_all_envs(self):
+ return self._needs_all_envs
+
+ def resolve_errors(self):
+ return self._resolve_errors
+
def as_dict(self):
return self._base.copy()
- def _update_scalar(self, cur, new, path):
- if isinstance(cur, RefValue) and path in self._occurrences:
- # If the current value already holds a RefValue, we better forget
- # the occurrence, or else interpolate() will later overwrite
- # unconditionally. If the new value is a RefValue, the occurrence
- # will be added again further on
- del self._occurrences[path]
-
- if self.delimiter is None or not isinstance(new, (types.StringTypes,
- RefValue)):
- # either there is no delimiter defined (and hence no references
- # are being used), or the new value is not a string (and hence
- # cannot be turned into a RefValue), and not a RefValue. We can
- # shortcut and just return the new scalar
- return new
-
- elif isinstance(new, RefValue):
- # the new value is (already) a RefValue, so we need not touch it
- # at all
- ret = new
-
+ def _wrap_value(self, value, path):
+ if isinstance(value, dict):
+ return self._wrap_dict(value, path)
+ elif isinstance(value, list):
+ return self._wrap_list(value, path)
+ elif isinstance(value, (Value, ValueList)):
+ return value
else:
- # the new value is a string, let's see if it contains references,
- # by way of wrapping it in a RefValue and querying the result
- ret = RefValue(new, self.delimiter)
- if not ret.has_references():
- # do not replace with RefValue instance if there are no
- # references, i.e. discard the RefValue in ret, just return
- # the new value
- return new
+ try:
+ return Value(value, self._settings, self._uri)
+ except InterpolationError as e:
+ e.context = str(path)
+ raise
- # So we now have a RefValue. Let's, keep a reference to the instance
- # we just created, in a dict indexed by the dictionary path, instead
- # of just a list. The keys are required to resolve dependencies during
- # interpolation
- self._occurrences[path] = ret
- return ret
+ def _wrap_list(self, source, path):
+ return [ self._wrap_value(v, path.new_subpath(k)) for (k, v) in enumerate(source) ]
- def _extend_list(self, cur, new, path):
- if isinstance(cur, list):
- ret = cur
- offset = len(cur)
+ def _wrap_dict(self, source, path):
+ return { k: self._wrap_value(v, path.new_subpath(k)) for k, v in source.iteritems() }
+
+ def _update_value(self, cur, new):
+ if isinstance(cur, Value):
+ values = ValueList(cur, self._settings)
+ elif isinstance(cur, ValueList):
+ values = cur
else:
- ret = [cur]
- offset = 1
+ values = ValueList(Value(cur, self._settings, self._uri), self._settings)
- for i in xrange(len(new)):
- ret.append(self._merge_recurse(None, new[i], path.new_subpath(offset + i)))
- return ret
+ if isinstance(new, Value):
+ values.append(new)
+ elif isinstance(new, ValueList):
+ values.extend(new)
+ else:
+ values.append(Value(new, self._settings, self._uri))
- def _merge_dict(self, cur, new, path, initmerge):
+ return values
+
+ def _merge_dict(self, cur, new, path):
"""Merge a dictionary with another dictionary.
Iterate over keys in new. If this is not an initialization merge and
@@ -139,31 +146,15 @@
"""
- if isinstance(cur, dict):
- ret = cur
- else:
- # nothing sensible to do
- raise TypeError('Cannot merge dict into {0} '
- 'objects'.format(type(cur)))
-
- if self.delimiter is None:
- # a delimiter of None indicates that there is no value
- # processing to be done, and since there is no current
- # value, we do not need to walk the new dictionary:
- ret.update(new)
- return ret
-
- ovrprfx = Parameters.DICT_KEY_OVERRIDE_PREFIX
-
+ ret = cur
for key, newvalue in new.iteritems():
- if key.startswith(ovrprfx) and not initmerge:
- ret[key.lstrip(ovrprfx)] = newvalue
+ if key.startswith(self._settings.dict_key_override_prefix) and not self._keep_overrides:
+ ret[key.lstrip(self._settings.dict_key_override_prefix)] = newvalue
else:
- ret[key] = self._merge_recurse(ret.get(key), newvalue,
- path.new_subpath(key), initmerge)
+ ret[key] = self._merge_recurse(ret.get(key), newvalue, path.new_subpath(key))
return ret
- def _merge_recurse(self, cur, new, path=None, initmerge=False):
+ def _merge_recurse(self, cur, new, path=None):
"""Merge a parameter with another parameter.
Iterate over keys in new. Call _merge_dict, _extend_list, or
@@ -182,23 +173,15 @@
"""
- if path is None:
- path = DictPath(self.delimiter)
- if isinstance(new, dict):
- if cur is None:
- cur = {}
- return self._merge_dict(cur, new, path, initmerge)
-
- elif isinstance(new, list):
- if cur is None:
- cur = []
- return self._extend_list(cur, new, path)
-
+ if cur is None:
+ return new
+ elif isinstance(new, dict) and isinstance(cur, dict):
+ return self._merge_dict(cur, new, path)
else:
- return self._update_scalar(cur, new, path)
+ return self._update_value(cur, new)
- def merge(self, other, initmerge=False):
+ def merge(self, other, wrap=True):
"""Merge function (public edition).
Call _merge_recurse on self with either another Parameter object or a
@@ -212,65 +195,146 @@
"""
+ self._unrendered = None
if isinstance(other, dict):
- self._base = self._merge_recurse(self._base, other,
- None, initmerge)
-
+ if wrap:
+ wrapped = self._wrap_dict(other, DictPath(self._settings.delimiter))
+ else:
+ wrapped = copy.deepcopy(other)
elif isinstance(other, self.__class__):
- self._base = self._merge_recurse(self._base, other._base,
- None, initmerge)
-
+ if wrap:
+ wrapped = self._wrap_dict(other._base, DictPath(self._settings.delimiter))
+ else:
+ wrapped = copy.deepcopy(other._base)
else:
raise TypeError('Cannot merge %s objects into %s' % (type(other),
self.__class__.__name__))
+ self._base = self._merge_recurse(self._base, wrapped, DictPath(self._settings.delimiter))
- def has_unresolved_refs(self):
- return len(self._occurrences) > 0
+ def _render_simple_container(self, container, key, value, path):
+ if isinstance(value, ValueList):
+ if value.is_complex():
+ p = path.new_subpath(key)
+ self._unrendered[p] = True
+ if value.has_inv_query():
+ self._inv_queries.append((p, value))
+ if value.needs_all_envs():
+ self._needs_all_envs = True
+ return
+ else:
+ value = value.merge()
+ if isinstance(value, Value) and value.is_container():
+ value = value.contents()
+ if isinstance(value, dict):
+ self._render_simple_dict(value, path.new_subpath(key))
+ container[key] = value
+ elif isinstance(value, list):
+ self._render_simple_list(value, path.new_subpath(key))
+ container[key] = value
+ elif isinstance(value, Value):
+ if value.is_complex():
+ p = path.new_subpath(key)
+ self._unrendered[p] = True
+ if value.has_inv_query():
+ self._inv_queries.append((p, value))
+ if value.needs_all_envs():
+ self._needs_all_envs = True
+ else:
+ container[key] = value.render(None, None)
- def interpolate(self):
- while self.has_unresolved_refs():
+ def _render_simple_dict(self, dictionary, path):
+ for key, value in dictionary.iteritems():
+ self._render_simple_container(dictionary, key, value, path)
+
+ def _render_simple_list(self, item_list, path):
+ for n, value in enumerate(item_list):
+ self._render_simple_container(item_list, n, value, path)
+
+ def interpolate(self, inventory=None):
+ self._initialise_interpolate()
+ while len(self._unrendered) > 0:
# we could use a view here, but this is simple enough:
# _interpolate_inner removes references from the refs hash after
# processing them, so we cannot just iterate the dict
- path, refvalue = self._occurrences.iteritems().next()
- self._interpolate_inner(path, refvalue)
+ path, v = self._unrendered.iteritems().next()
+ self._interpolate_inner(path, inventory)
+ if self._resolve_errors.have_errors():
+ raise self._resolve_errors
- def _interpolate_inner(self, path, refvalue):
- self._occurrences[path] = True # mark as seen
- for ref in refvalue.get_references():
- path_from_ref = DictPath(self.delimiter, ref)
- try:
- refvalue_inner = self._occurrences[path_from_ref]
+ def initialise_interpolation(self):
+ self._unrendered = None
+ self._initialise_interpolate()
- # If there is no reference, then this will throw a KeyError,
- # look further down where this is caught and execution passed
- # to the next iteration of the loop
- #
- # If we get here, then the ref references another parameter,
- # requiring us to recurse, dereferencing first those refs that
- # are most used and are thus at the leaves of the dependency
- # tree.
+ def _initialise_interpolate(self):
+ if self._unrendered is None:
+ self._unrendered = {}
+ self._inv_queries = []
+ self._needs_all_envs = False
+ self._resolve_errors = ResolveErrorList()
+ self._render_simple_dict(self._base, DictPath(self._settings.delimiter))
- if refvalue_inner is True:
- # every call to _interpolate_inner replaces the value of
- # the saved occurrences of a reference with True.
- # Therefore, if we encounter True instead of a refvalue,
- # it means that we have already processed it and are now
- # faced with a cyclical reference.
- raise InfiniteRecursionError(path, ref)
- self._interpolate_inner(path_from_ref, refvalue_inner)
+ def _interpolate_inner(self, path, inventory):
+ value = path.get_value(self._base)
+ if not isinstance(value, (Value, ValueList)):
+ # references to lists and dicts are only deepcopied when merged
+ # together so it's possible a value with references in a referenced
+ # list or dict has already been visited by _interpolate_inner
+ del self._unrendered[path]
+ return
+ self._unrendered[path] = False
+ self._interpolate_references(path, value, inventory)
+ new = self._interpolate_render_value(path, value, inventory)
+ path.set_value(self._base, new)
+ del self._unrendered[path]
- except KeyError as e:
- # not actually an error, but we are done resolving all
- # dependencies of the current ref, so move on
- continue
-
+ def _interpolate_render_value(self, path, value, inventory):
try:
- new = refvalue.render(self._base)
- path.set_value(self._base, new)
+ new = value.render(self._base, inventory)
+ except ResolveError as e:
+ e.context = path
+ if self._settings.group_errors:
+ self._resolve_errors.add(e)
+ new = None
+ else:
+ raise
- # finally, remove the reference from the occurrences cache
- del self._occurrences[path]
- except UndefinedVariableError as e:
- raise UndefinedVariableError(e.var, path)
+ if isinstance(new, dict):
+ self._render_simple_dict(new, path)
+ elif isinstance(new, list):
+ self._render_simple_list(new, path)
+ return new
+ def _interpolate_references(self, path, value, inventory):
+ all_refs = False
+ while not all_refs:
+ for ref in value.get_references():
+ path_from_ref = DictPath(self._settings.delimiter, ref)
+
+ if path_from_ref in self._unrendered:
+ if self._unrendered[path_from_ref] is False:
+ # _interpolate_inner sets self._unrendered[path] to
+ # False while that path is being rendered. Therefore,
+ # encountering False here means the referenced path is
+ # still being resolved, i.e. we are faced with a
+ # cyclical reference.
+ raise InfiniteRecursionError(path, ref, value.uri())
+ else:
+ self._interpolate_inner(path_from_ref, inventory)
+ else:
+ # ensure ancestor keys are already dereferenced
+ ancestor = DictPath(self._settings.delimiter)
+ for k in path_from_ref.key_parts():
+ ancestor = ancestor.new_subpath(k)
+ if ancestor in self._unrendered:
+ self._interpolate_inner(ancestor, inventory)
+ if value.allRefs():
+ all_refs = True
+ else:
+ # not all references in the value could be calculated previously so
+ # try recalculating references with current context and recursively
+ # call _interpolate_inner if the number of references has increased
+ # Otherwise raise an error
+ old = len(value.get_references())
+ value.assembleRefs(self._base)
+ if old == len(value.get_references()):
+ raise BadReferencesError(value.get_references(), str(path), value.uri())
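`_interpolate_references` above detects cycles with a marker scheme: `_unrendered` maps a path to `True` while it is pending and to `False` while `_interpolate_inner` is rendering it, so meeting `False` again means a cyclical reference. The same scheme in isolation (`resolve` is a hypothetical helper, not reclass code):

```python
def resolve(graph):
    """graph: name -> list of referenced names. Returns a dependency-first
    resolution order, raising on cycles via the True/False marker scheme."""
    unrendered = {k: True for k in graph}  # True: pending, False: in progress
    order = []

    def visit(name):
        unrendered[name] = False           # mark as being rendered
        for dep in graph[name]:
            if dep in unrendered:
                if unrendered[dep] is False:
                    raise RuntimeError('cyclical reference: %s -> %s'
                                       % (name, dep))
                visit(dep)                 # render the dependency first
        order.append(name)
        del unrendered[name]               # fully rendered, forget it

    while unrendered:
        visit(next(iter(unrendered)))
    return order
```

Dependencies already removed from `unrendered` are simply skipped, which mirrors how an already-rendered reclass parameter needs no further interpolation.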
diff --git a/reclass/datatypes/tests/test_entity.py b/reclass/datatypes/tests/test_entity.py
index 17ec9e8..f398d51 100644
--- a/reclass/datatypes/tests/test_entity.py
+++ b/reclass/datatypes/tests/test_entity.py
@@ -6,91 +6,99 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
-from reclass.datatypes import Entity, Classes, Parameters, Applications
+
+from reclass.settings import Settings
+from reclass.datatypes import Entity, Classes, Parameters, Applications, Exports
+from reclass.errors import ResolveError
import unittest
try:
import unittest.mock as mock
except ImportError:
import mock
+SETTINGS = Settings()
+
@mock.patch.multiple('reclass.datatypes', autospec=True, Classes=mock.DEFAULT,
- Applications=mock.DEFAULT,
- Parameters=mock.DEFAULT)
+ Applications=mock.DEFAULT, Parameters=mock.DEFAULT,
+ Exports=mock.DEFAULT)
class TestEntity(unittest.TestCase):
- def _make_instances(self, Classes, Applications, Parameters):
- return Classes(), Applications(), Parameters()
+ def _make_instances(self, Classes, Applications, Parameters, Exports):
+ return Classes(), Applications(), Parameters({}, SETTINGS, ""), Exports({}, SETTINGS, "")
def test_constructor_default(self, **mocks):
# Actually test the real objects by calling the default constructor,
# all other tests shall pass instances to the constructor
- e = Entity()
+ e = Entity(SETTINGS)
self.assertEqual(e.name, '')
self.assertEqual(e.uri, '')
self.assertIsInstance(e.classes, Classes)
self.assertIsInstance(e.applications, Applications)
self.assertIsInstance(e.parameters, Parameters)
+ self.assertIsInstance(e.exports, Exports)
def test_constructor_empty(self, **types):
instances = self._make_instances(**types)
- e = Entity(*instances)
+ e = Entity(SETTINGS, *instances)
self.assertEqual(e.name, '')
self.assertEqual(e.uri, '')
- cl, al, pl = [getattr(i, '__len__') for i in instances]
+ cl, al, pl, ex = [getattr(i, '__len__') for i in instances]
self.assertEqual(len(e.classes), cl.return_value)
cl.assert_called_once_with()
self.assertEqual(len(e.applications), al.return_value)
al.assert_called_once_with()
self.assertEqual(len(e.parameters), pl.return_value)
pl.assert_called_once_with()
+ self.assertEqual(len(e.exports), pl.return_value)
+ ex.assert_called_once_with()
def test_constructor_empty_named(self, **types):
name = 'empty'
- e = Entity(*self._make_instances(**types), name=name)
+ e = Entity(SETTINGS, *self._make_instances(**types), name=name)
self.assertEqual(e.name, name)
def test_constructor_empty_uri(self, **types):
uri = 'test://uri'
- e = Entity(*self._make_instances(**types), uri=uri)
+ e = Entity(SETTINGS, *self._make_instances(**types), uri=uri)
self.assertEqual(e.uri, uri)
def test_constructor_empty_env(self, **types):
env = 'not base'
- e = Entity(*self._make_instances(**types), environment=env)
+ e = Entity(SETTINGS, *self._make_instances(**types), environment=env)
self.assertEqual(e.environment, env)
def test_equal_empty(self, **types):
instances = self._make_instances(**types)
- self.assertEqual(Entity(*instances), Entity(*instances))
+ self.assertEqual(Entity(SETTINGS, *instances), Entity(SETTINGS, *instances))
for i in instances:
i.__eq__.assert_called_once_with(i)
def test_equal_empty_named(self, **types):
instances = self._make_instances(**types)
- self.assertEqual(Entity(*instances), Entity(*instances))
+ self.assertEqual(Entity(SETTINGS, *instances), Entity(SETTINGS, *instances))
name = 'empty'
- self.assertEqual(Entity(*instances, name=name),
- Entity(*instances, name=name))
+ self.assertEqual(Entity(SETTINGS, *instances, name=name),
+ Entity(SETTINGS, *instances, name=name))
def test_unequal_empty_uri(self, **types):
instances = self._make_instances(**types)
uri = 'test://uri'
- self.assertNotEqual(Entity(*instances, uri=uri),
- Entity(*instances, uri=uri[::-1]))
+ self.assertNotEqual(Entity(SETTINGS, *instances, uri=uri),
+ Entity(SETTINGS, *instances, uri=uri[::-1]))
for i in instances:
i.__eq__.assert_called_once_with(i)
def test_unequal_empty_named(self, **types):
instances = self._make_instances(**types)
name = 'empty'
- self.assertNotEqual(Entity(*instances, name=name),
- Entity(*instances, name=name[::-1]))
+ self.assertNotEqual(Entity(SETTINGS, *instances, name=name),
+ Entity(SETTINGS, *instances, name=name[::-1]))
for i in instances:
i.__eq__.assert_called_once_with(i)
def test_unequal_types(self, **types):
instances = self._make_instances(**types)
- self.assertNotEqual(Entity(*instances, name='empty'),
+ self.assertNotEqual(Entity(SETTINGS, *instances, name='empty'),
None)
for i in instances:
self.assertEqual(i.__eq__.call_count, 0)
@@ -98,7 +106,7 @@
def _test_constructor_wrong_types(self, which_replace, **types):
instances = self._make_instances(**types)
instances[which_replace] = 'Invalid type'
- e = Entity(*instances)
+ e = Entity(SETTINGS, *instances)
def test_constructor_wrong_type_classes(self, **types):
self.assertRaises(TypeError, self._test_constructor_wrong_types, 0)
@@ -111,7 +119,7 @@
def test_merge(self, **types):
instances = self._make_instances(**types)
- e = Entity(*instances)
+ e = Entity(SETTINGS, *instances)
e.merge(e)
for i, fn in zip(instances, ('merge_unique', 'merge_unique', 'merge')):
getattr(i, fn).assert_called_once_with(i)
@@ -119,38 +127,153 @@
def test_merge_newname(self, **types):
instances = self._make_instances(**types)
newname = 'newname'
- e1 = Entity(*instances, name='oldname')
- e2 = Entity(*instances, name=newname)
+ e1 = Entity(SETTINGS, *instances, name='oldname')
+ e2 = Entity(SETTINGS, *instances, name=newname)
e1.merge(e2)
self.assertEqual(e1.name, newname)
def test_merge_newuri(self, **types):
instances = self._make_instances(**types)
newuri = 'test://uri2'
- e1 = Entity(*instances, uri='test://uri1')
- e2 = Entity(*instances, uri=newuri)
+ e1 = Entity(SETTINGS, *instances, uri='test://uri1')
+ e2 = Entity(SETTINGS, *instances, uri=newuri)
e1.merge(e2)
self.assertEqual(e1.uri, newuri)
def test_merge_newenv(self, **types):
instances = self._make_instances(**types)
newenv = 'new env'
- e1 = Entity(*instances, environment='env')
- e2 = Entity(*instances, environment=newenv)
+ e1 = Entity(SETTINGS, *instances, environment='env')
+ e2 = Entity(SETTINGS, *instances, environment=newenv)
e1.merge(e2)
self.assertEqual(e1.environment, newenv)
def test_as_dict(self, **types):
instances = self._make_instances(**types)
- entity = Entity(*instances, name='test', environment='test')
+ entity = Entity(SETTINGS, *instances, name='test', environment='test')
comp = {}
comp['classes'] = instances[0].as_list()
comp['applications'] = instances[1].as_list()
comp['parameters'] = instances[2].as_dict()
+ comp['exports'] = instances[3].as_dict()
comp['environment'] = 'test'
d = entity.as_dict()
self.assertDictEqual(d, comp)
+class TestEntityNoMock(unittest.TestCase):
+
+ def test_exports_with_refs(self):
+ inventory = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}}
+ node3_exports = Exports({'a': '${a}', 'b': '${b}'}, SETTINGS, '')
+ node3_parameters = Parameters({'name': 'node3', 'a': '${c}', 'b': 5}, SETTINGS, '')
+ node3_parameters.merge({'c': 3})
+ node3_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node3_parameters, exports=node3_exports)
+ node3_entity.interpolate_exports()
+ inventory['node3'] = node3_entity.exports.as_dict()
+ r = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}, 'node3': {'a': 3, 'b': 5}}
+ self.assertDictEqual(inventory, r)
+
+ def test_reference_to_an_export(self):
+ inventory = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}}
+ node3_exports = Exports({'a': '${a}', 'b': '${b}'}, SETTINGS, '')
+ node3_parameters = Parameters({'name': 'node3', 'ref': '${exp}', 'a': '${c}', 'b': 5}, SETTINGS, '')
+ node3_parameters.merge({'c': 3, 'exp': '$[ exports:a ]'})
+ node3_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node3_parameters, exports=node3_exports)
+ node3_entity.interpolate_exports()
+ inventory['node3'] = node3_entity.exports.as_dict()
+ node3_entity.interpolate(inventory)
+ res_inv = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}, 'node3': {'a': 3, 'b': 5}}
+ res_params = {'a': 3, 'c': 3, 'b': 5, 'name': 'node3', 'exp': {'node1': 1, 'node3': 3, 'node2': 3}, 'ref': {'node1': 1, 'node3': 3, 'node2': 3}}
+ self.assertDictEqual(node3_parameters.as_dict(), res_params)
+ self.assertDictEqual(inventory, res_inv)
+
+ def test_exports_multiple_nodes(self):
+ node1_exports = Exports({'a': '${a}'}, SETTINGS, '')
+ node1_parameters = Parameters({'name': 'node1', 'a': { 'test': '${b}' }, 'b': 1, 'exp': '$[ exports:a ]'}, SETTINGS, '')
+ node1_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node1_parameters, exports=node1_exports)
+ node2_exports = Exports({'a': '${a}'}, SETTINGS, '')
+ node2_parameters = Parameters({'name': 'node2', 'a': { 'test': '${b}' }, 'b': 2 }, SETTINGS, '')
+ node2_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node2_parameters, exports=node2_exports)
+ node1_entity.initialise_interpolation()
+ node2_entity.initialise_interpolation()
+ queries = node1_entity.parameters.get_inv_queries()
+ for p, q in queries:
+ node1_entity.interpolate_single_export(q)
+ node2_entity.interpolate_single_export(q)
+ res_inv = {'node1': {'a': {'test': 1}}, 'node2': {'a': {'test': 2}}}
+ res_params = {'a': {'test': 1}, 'b': 1, 'name': 'node1', 'exp': {'node1': {'test': 1}, 'node2': {'test': 2}}}
+ inventory = {}
+ inventory['node1'] = node1_entity.exports.as_dict()
+ inventory['node2'] = node2_entity.exports.as_dict()
+ node1_entity.interpolate(inventory)
+ self.assertDictEqual(node1_parameters.as_dict(), res_params)
+ self.assertDictEqual(inventory, res_inv)
+
+ def test_exports_with_ancestor_references(self):
+ inventory = {'node1': {'alpha' : {'beta': {'a': 1, 'b': 2}}}, 'node2': {'alpha' : {'beta': {'a': 3, 'b': 4}}}}
+ node3_exports = Exports({'alpha': '${alpha}'}, SETTINGS, '')
+ node3_parameters = Parameters({'name': 'node3', 'alpha': {'beta' : {'a': 5, 'b': 6}}, 'exp': '$[ exports:alpha:beta ]'}, SETTINGS, '')
+ node3_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node3_parameters, exports=node3_exports)
+ res_params = {'exp': {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}, 'node3': {'a': 5, 'b': 6}}, 'name': 'node3', 'alpha': {'beta': {'a': 5, 'b': 6}}}
+ res_inv = {'node1': {'alpha' : {'beta': {'a': 1, 'b': 2}}}, 'node2': {'alpha' : {'beta': {'a': 3, 'b': 4}}}, 'node3': {'alpha' : {'beta': {'a': 5, 'b': 6}}}}
+ node3_entity.initialise_interpolation()
+ queries = node3_entity.parameters.get_inv_queries()
+ for p, q in queries:
+ node3_entity.interpolate_single_export(q)
+ inventory['node3'] = node3_entity.exports.as_dict()
+ node3_entity.interpolate(inventory)
+ self.assertDictEqual(node3_parameters.as_dict(), res_params)
+ self.assertDictEqual(inventory, res_inv)
+
+ def test_exports_with_nested_references(self):
+ inventory = {'node1': {'alpha': {'a': 1, 'b': 2}}, 'node2': {'alpha': {'a': 3, 'b': 4}}}
+ node3_exports = Exports({'alpha': '${alpha}'}, SETTINGS, '')
+ node3_parameters = Parameters({'name': 'node3', 'alpha': {'a': '${one}', 'b': '${two}'}, 'beta': '$[ exports:alpha ]', 'one': '111', 'two': '${three}', 'three': '123'}, SETTINGS, '')
+ node3_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node3_parameters, exports=node3_exports)
+ res_params = {'beta': {'node1': {'a': 1, 'b': 2}, 'node3': {'a': '111', 'b': '123'}, 'node2': {'a': 3, 'b': 4}}, 'name': 'node3', 'alpha': {'a': '111', 'b': '123'}, 'three': '123', 'two': '123', 'one': '111'}
+ res_inv = {'node1': {'alpha': {'a': 1, 'b': 2}}, 'node2': {'alpha': {'a': 3, 'b': 4}}, 'node3': {'alpha': {'a': '111', 'b': '123'}}}
+ node3_entity.interpolate_exports()
+ inventory['node3'] = node3_entity.exports.as_dict()
+ node3_entity.interpolate(inventory)
+ self.assertDictEqual(node3_parameters.as_dict(), res_params)
+ self.assertDictEqual(inventory, res_inv)
+
+ def test_exports_failed_render(self):
+ node1_exports = Exports({'a': '${a}'}, SETTINGS, '')
+ node1_parameters = Parameters({'name': 'node1', 'a': 1, 'exp': '$[ exports:a ]'}, SETTINGS, '')
+ node1_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node1_parameters, exports=node1_exports)
+ node2_exports = Exports({'a': '${b}'}, SETTINGS, '')
+ node2_parameters = Parameters({'name': 'node2', 'a': 2}, SETTINGS, '')
+ node2_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node2_parameters, exports=node2_exports)
+ node1_entity.initialise_interpolation()
+ node2_entity.initialise_interpolation()
+ queries = node1_entity.parameters.get_inv_queries()
+ with self.assertRaises(ResolveError):
+ for p, q in queries:
+ node1_entity.interpolate_single_export(q)
+ node2_entity.interpolate_single_export(q)
+
+ def test_exports_failed_render_ignore(self):
+ node1_exports = Exports({'a': '${a}'}, SETTINGS, '')
+ node1_parameters = Parameters({'name': 'node1', 'a': 1, 'exp': '$[ +IgnoreErrors exports:a ]'}, SETTINGS, '')
+ node1_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node1_parameters, exports=node1_exports)
+ node2_exports = Exports({'a': '${b}'}, SETTINGS, '')
+ node2_parameters = Parameters({'name': 'node2', 'a': 2}, SETTINGS, '')
+ node2_entity = Entity(SETTINGS, classes=None, applications=None, parameters=node2_parameters, exports=node2_exports)
+ node1_entity.initialise_interpolation()
+ node2_entity.initialise_interpolation()
+ queries = node1_entity.parameters.get_inv_queries()
+ for p, q in queries:
+ node1_entity.interpolate_single_export(q)
+ node2_entity.interpolate_single_export(q)
+ res_inv = {'node1': {'a': 1}, 'node2': {}}
+ res_params = { 'a': 1, 'name': 'node1', 'exp': {'node1': 1} }
+ inventory = {}
+ inventory['node1'] = node1_entity.exports.as_dict()
+ inventory['node2'] = node2_entity.exports.as_dict()
+ node1_entity.interpolate(inventory)
+ self.assertDictEqual(node1_parameters.as_dict(), res_params)
+ self.assertDictEqual(inventory, res_inv)
if __name__ == '__main__':
unittest.main()
diff --git a/reclass/datatypes/tests/test_exports.py b/reclass/datatypes/tests/test_exports.py
new file mode 100644
index 0000000..33eccbe
--- /dev/null
+++ b/reclass/datatypes/tests/test_exports.py
@@ -0,0 +1,101 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass (http://github.com/madduck/reclass)
+#
+
+from reclass.settings import Settings
+from reclass.datatypes import Exports, Parameters
+from reclass.errors import ParseError
+import unittest
+
+SETTINGS = Settings()
+
+class TestInvQuery(unittest.TestCase):
+
+ def test_overwrite_method(self):
+ e = Exports({'alpha': { 'one': 1, 'two': 2}}, SETTINGS, '')
+ d = {'alpha': { 'three': 3, 'four': 4}}
+ e.overwrite(d)
+ e.initialise_interpolation()
+ self.assertEqual(e.as_dict(), d)
+
+ def test_malformed_invquery(self):
+ with self.assertRaises(ParseError):
+ p = Parameters({'exp': '$[ exports:a exports:b == self:test_value ]'}, SETTINGS, '')
+ with self.assertRaises(ParseError):
+ p = Parameters({'exp': '$[ exports:a if exports:b self:test_value ]'}, SETTINGS, '')
+ with self.assertRaises(ParseError):
+ p = Parameters({'exp': '$[ exports:a if exports:b == ]'}, SETTINGS, '')
+ with self.assertRaises(ParseError):
+ p = Parameters({'exp': '$[ exports:a if exports:b == self:test_value and exports:c = self:test_value2 ]'}, SETTINGS, '')
+ with self.assertRaises(ParseError):
+ p = Parameters({'exp': '$[ exports:a if exports:b == self:test_value or exports:c == ]'}, SETTINGS, '')
+ with self.assertRaises(ParseError):
+ p = Parameters({'exp': '$[ exports:a if exports:b == self:test_value anddd exports:c == self:test_value2 ]'}, SETTINGS, '')
+
+ def test_value_expr_invquery(self):
+ e = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}}
+ p = Parameters({'exp': '$[ exports:a ]'}, SETTINGS, '')
+ r = {'exp': {'node1': 1, 'node2': 3}}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_if_expr_invquery(self):
+ e = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}}
+ p = Parameters({'exp': '$[ exports:a if exports:b == 4 ]'}, SETTINGS, '')
+ r = {'exp': {'node2': 3}}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_if_expr_invquery_with_refs(self):
+ e = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}}
+ p = Parameters({'exp': '$[ exports:a if exports:b == self:test_value ]', 'test_value': 2}, SETTINGS, '')
+ r = {'exp': {'node1': 1}, 'test_value': 2}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_list_if_expr_invquery(self):
+ e = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 3}, 'node3': {'a': 3, 'b': 2}}
+ p = Parameters({'exp': '$[ if exports:b == 2 ]'}, SETTINGS, '')
+ r = {'exp': ['node1', 'node3']}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_if_expr_invquery_with_and(self):
+ e = {'node1': {'a': 1, 'b': 4, 'c': False}, 'node2': {'a': 3, 'b': 4, 'c': True}}
+ p = Parameters({'exp': '$[ exports:a if exports:b == 4 and exports:c == True ]'}, SETTINGS, '')
+ r = {'exp': {'node2': 3}}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_if_expr_invquery_with_or(self):
+ e = {'node1': {'a': 1, 'b': 4}, 'node2': {'a': 3, 'b': 3}}
+ p = Parameters({'exp': '$[ exports:a if exports:b == 4 or exports:b == 3 ]'}, SETTINGS, '')
+ r = {'exp': {'node1': 1, 'node2': 3}}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_list_if_expr_invquery_with_and(self):
+ e = {'node1': {'a': 1, 'b': 2, 'c': 'green'}, 'node2': {'a': 3, 'b': 3}, 'node3': {'a': 3, 'b': 2, 'c': 'red'}}
+ p = Parameters({'exp': '$[ if exports:b == 2 and exports:c == green ]'}, SETTINGS, '')
+ r = {'exp': ['node1']}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_list_if_expr_invquery_with_and_missing(self):
+ e = {'node1': {'a': 1, 'b': 2, 'c': 'green'}, 'node2': {'a': 3, 'b': 3}, 'node3': {'a': 3, 'b': 2}}
+ p = Parameters({'exp': '$[ if exports:b == 2 and exports:c == green ]'}, SETTINGS, '')
+ r = {'exp': ['node1']}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+ def test_list_if_expr_invquery_with_or(self):
+ e = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 3}, 'node3': {'a': 3, 'b': 4}}
+ p = Parameters({'exp': '$[ if exports:b == 2 or exports:b == 4 ]'}, SETTINGS, '')
+ r = {'exp': ['node1', 'node3']}
+ p.interpolate(e)
+ self.assertEqual(p.as_dict(), r)
+
+if __name__ == '__main__':
+ unittest.main()
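The tests in this new file pin down the inventory-query semantics without showing them in isolation. As a standalone sketch (not reclass's real API), `$[ exports:a if exports:b == 4 ]` collects, per node, the export at path `a` whenever that node's export at path `b` equals 4, while the value-less form `$[ if exports:b == 2 ]` yields a list of matching node names:

```python
# Toy model of the two query forms exercised above; function names and
# signatures are illustrative only, not part of reclass.

def value_query(inventory, value_key, cond_key, cond_value):
    """Return {node: exports[value_key]} for nodes whose condition matches."""
    return {node: exports[value_key]
            for node, exports in inventory.items()
            if exports.get(cond_key) == cond_value}

def node_list_query(inventory, cond_key, cond_value):
    """Return the names of matching nodes (the value-less query form)."""
    return sorted(node for node, exports in inventory.items()
                  if exports.get(cond_key) == cond_value)

inventory = {'node1': {'a': 1, 'b': 2}, 'node2': {'a': 3, 'b': 4}}
print(value_query(inventory, 'a', 'b', 4))  # {'node2': 3}
print(node_list_query(inventory, 'b', 2))   # ['node1']
```

Nodes missing the conditioned export simply fail the comparison, matching `test_list_if_expr_invquery_with_and_missing` above.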
diff --git a/reclass/datatypes/tests/test_parameters.py b/reclass/datatypes/tests/test_parameters.py
index 5100639..577bdc4 100644
--- a/reclass/datatypes/tests/test_parameters.py
+++ b/reclass/datatypes/tests/test_parameters.py
@@ -6,9 +6,12 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
+
+import copy
+
+from reclass.settings import Settings
from reclass.datatypes import Parameters
-from reclass.defaults import PARAMETER_INTERPOLATION_SENTINELS
-from reclass.errors import InfiniteRecursionError
+from reclass.errors import InfiniteRecursionError, InterpolationError, ResolveError, ResolveErrorList
import unittest
try:
import unittest.mock as mock
@@ -16,15 +19,29 @@
import mock
SIMPLE = {'one': 1, 'two': 2, 'three': 3}
+SETTINGS = Settings()
+
+class MockDevice(object):
+ def __init__(self):
+ self._text = ''
+
+ def write(self, s):
+ self._text += s
+ return
+
+ def text(self):
+ return self._text
class TestParameters(unittest.TestCase):
- def _construct_mocked_params(self, iterable=None, delimiter=None):
- p = Parameters(iterable, delimiter)
+ def _construct_mocked_params(self, iterable=None, settings=SETTINGS):
+ p = Parameters(iterable, settings, '')
self._base = base = p._base
p._base = mock.MagicMock(spec_set=dict, wraps=base)
p._base.__repr__ = mock.MagicMock(autospec=dict.__repr__,
return_value=repr(base))
+ p._base.__getitem__.side_effect = base.__getitem__
+ p._base.__setitem__.side_effect = base.__setitem__
return p, p._base
def test_len_empty(self):
@@ -44,22 +61,13 @@
def test_repr_empty(self):
p, b = self._construct_mocked_params()
b.__repr__.return_value = repr({})
- self.assertEqual('%r' % p, '%s(%r, %r)' % (p.__class__.__name__, {},
- Parameters.DEFAULT_PATH_DELIMITER))
+ self.assertEqual('%r' % p, '%s(%r)' % (p.__class__.__name__, {}))
b.__repr__.assert_called_once_with()
def test_repr(self):
p, b = self._construct_mocked_params(SIMPLE)
b.__repr__.return_value = repr(SIMPLE)
- self.assertEqual('%r' % p, '%s(%r, %r)' % (p.__class__.__name__, SIMPLE,
- Parameters.DEFAULT_PATH_DELIMITER))
- b.__repr__.assert_called_once_with()
-
- def test_repr_delimiter(self):
- delim = '%'
- p, b = self._construct_mocked_params(SIMPLE, delim)
- b.__repr__.return_value = repr(SIMPLE)
- self.assertEqual('%r' % p, '%s(%r, %r)' % (p.__class__.__name__, SIMPLE, delim))
+ self.assertEqual('%r' % p, '%s(%r)' % (p.__class__.__name__, SIMPLE))
b.__repr__.assert_called_once_with()
def test_equal_empty(self):
@@ -71,8 +79,7 @@
def test_equal_default_delimiter(self):
p1, b1 = self._construct_mocked_params(SIMPLE)
- p2, b2 = self._construct_mocked_params(SIMPLE,
- Parameters.DEFAULT_PATH_DELIMITER)
+ p2, b2 = self._construct_mocked_params(SIMPLE, SETTINGS)
b1.__eq__.return_value = True
self.assertEqual(p1, p2)
b1.__eq__.assert_called_once_with(b2)
@@ -92,8 +99,10 @@
b1.__eq__.assert_called_once_with(b2)
def test_unequal_delimiter(self):
- p1, b1 = self._construct_mocked_params(delimiter=':')
- p2, b2 = self._construct_mocked_params(delimiter='%')
+ settings1 = Settings({'delimiter': ':'})
+ settings2 = Settings({'delimiter': '%'})
+ p1, b1 = self._construct_mocked_params(settings=settings1)
+ p2, b2 = self._construct_mocked_params(settings=settings2)
b1.__eq__.return_value = False
self.assertNotEqual(p1, p2)
b1.__eq__.assert_called_once_with(b2)
@@ -114,6 +123,7 @@
def test_get_dict(self):
p, b = self._construct_mocked_params(SIMPLE)
+ p.initialise_interpolation()
self.assertDictEqual(p.as_dict(), SIMPLE)
def test_merge_scalars(self):
@@ -121,6 +131,7 @@
mergee = {'five':5,'four':4,'None':None,'tuple':(1,2,3)}
p2, b2 = self._construct_mocked_params(mergee)
p1.merge(p2)
+ p1.initialise_interpolation()
for key, value in mergee.iteritems():
# check that each key, value in mergee resulted in a get call and
# a __setitem__ call against b1 (the merge target)
@@ -128,26 +139,29 @@
self.assertIn(mock.call(key, value), b1.__setitem__.call_args_list)
def test_stray_occurrence_overwrites_during_interpolation(self):
- p1 = Parameters({'r' : mock.sentinel.ref, 'b': '${r}'})
- p2 = Parameters({'b' : mock.sentinel.goal})
+ p1 = Parameters({'r' : mock.sentinel.ref, 'b': '${r}'}, SETTINGS, '')
+ p2 = Parameters({'b' : mock.sentinel.goal}, SETTINGS, '')
p1.merge(p2)
p1.interpolate()
self.assertEqual(p1.as_dict()['b'], mock.sentinel.goal)
+
class TestParametersNoMock(unittest.TestCase):
def test_merge_scalars(self):
- p = Parameters(SIMPLE)
+ p = Parameters(SIMPLE, SETTINGS, '')
mergee = {'five':5,'four':4,'None':None,'tuple':(1,2,3)}
p.merge(mergee)
+ p.initialise_interpolation()
goal = SIMPLE.copy()
goal.update(mergee)
self.assertDictEqual(p.as_dict(), goal)
def test_merge_scalars_overwrite(self):
- p = Parameters(SIMPLE)
+ p = Parameters(SIMPLE, SETTINGS, '')
mergee = {'two':5,'four':4,'three':None,'one':(1,2,3)}
p.merge(mergee)
+ p.initialise_interpolation()
goal = SIMPLE.copy()
goal.update(mergee)
self.assertDictEqual(p.as_dict(), goal)
@@ -155,35 +169,80 @@
def test_merge_lists(self):
l1 = [1,2,3]
l2 = [2,3,4]
- p1 = Parameters(dict(list=l1[:]))
- p2 = Parameters(dict(list=l2))
+ p1 = Parameters(dict(list=l1[:]), SETTINGS, '')
+ p2 = Parameters(dict(list=l2), SETTINGS, '')
p1.merge(p2)
+ p1.initialise_interpolation()
self.assertListEqual(p1.as_dict()['list'], l1+l2)
def test_merge_list_into_scalar(self):
+ settings = Settings({'allow_list_over_scalar': True})
l = ['foo', 1, 2]
- p1 = Parameters(dict(key=l[0]))
- p1.merge(Parameters(dict(key=l[1:])))
+ p1 = Parameters(dict(key=l[0]), settings, '')
+ p2 = Parameters(dict(key=l[1:]), settings, '')
+ p1.merge(p2)
+ p1.initialise_interpolation()
self.assertListEqual(p1.as_dict()['key'], l)
def test_merge_scalar_over_list(self):
l = ['foo', 1, 2]
- p1 = Parameters(dict(key=l[:2]))
- p1.merge(Parameters(dict(key=l[2])))
+ settings = Settings({'allow_scalar_over_list': True})
+ p1 = Parameters(dict(key=l[:2]), settings, '')
+ p2 = Parameters(dict(key=l[2]), settings, '')
+ p1.merge(p2)
+ p1.initialise_interpolation()
self.assertEqual(p1.as_dict()['key'], l[2])
+ def test_merge_none_over_list(self):
+ l = ['foo', 1, 2]
+ settings = Settings({'allow_none_override': True})
+ p1 = Parameters(dict(key=l[:2]), settings, '')
+ p2 = Parameters(dict(key=None), settings, '')
+ p1.merge(p2)
+ p1.initialise_interpolation()
+ self.assertEqual(p1.as_dict()['key'], None)
+
+ def test_merge_none_over_dict(self):
+ settings = Settings({'allow_none_override': True})
+ p1 = Parameters(dict(key=SIMPLE), settings, '')
+ p2 = Parameters(dict(key=None), settings, '')
+ p1.merge(p2)
+ p1.initialise_interpolation()
+ self.assertEqual(p1.as_dict()['key'], None)
+
+ # def test_merge_bare_dict_over_dict(self):
+ # settings = Settings({'allow_bare_override': True})
+ # p1 = Parameters(dict(key=SIMPLE), settings, '')
+ # p2 = Parameters(dict(key=dict()), settings, '')
+ # p1.merge(p2)
+ # p1.initialise_interpolation()
+ # self.assertEqual(p1.as_dict()['key'], {})
+
+ # def test_merge_bare_list_over_list(self):
+ # l = ['foo', 1, 2]
+ # settings = Settings({'allow_bare_override': True})
+ # p1 = Parameters(dict(key=l), settings, '')
+ # p2 = Parameters(dict(key=list()), settings, '')
+ # p1.merge(p2)
+ # p1.initialise_interpolation()
+ # self.assertEqual(p1.as_dict()['key'], [])
+
def test_merge_dicts(self):
mergee = {'five':5,'four':4,'None':None,'tuple':(1,2,3)}
- p = Parameters(dict(dict=SIMPLE))
- p.merge(Parameters(dict(dict=mergee)))
+ p = Parameters(dict(dict=SIMPLE), SETTINGS, '')
+ p2 = Parameters(dict(dict=mergee), SETTINGS, '')
+ p.merge(p2)
+ p.initialise_interpolation()
goal = SIMPLE.copy()
goal.update(mergee)
self.assertDictEqual(p.as_dict(), dict(dict=goal))
def test_merge_dicts_overwrite(self):
mergee = {'two':5,'four':4,'three':None,'one':(1,2,3)}
- p = Parameters(dict(dict=SIMPLE))
- p.merge(Parameters(dict(dict=mergee)))
+ p = Parameters(dict(dict=SIMPLE), SETTINGS, '')
+ p2 = Parameters(dict(dict=mergee), SETTINGS, '')
+ p.merge(p2)
+ p.initialise_interpolation()
goal = SIMPLE.copy()
goal.update(mergee)
self.assertDictEqual(p.as_dict(), dict(dict=goal))
@@ -196,62 +255,374 @@
'two': ['delta']}
goal = {'one': {'a': 'alpha'},
'two': ['gamma']}
- p = Parameters(dict(dict=base))
- p.merge(Parameters(dict(dict=mergee)))
+ p = Parameters(dict(dict=base), SETTINGS, '')
+ p2 = Parameters(dict(dict=mergee), SETTINGS, '')
+ p.merge(p2)
+ p.initialise_interpolation()
self.assertDictEqual(p.as_dict(), dict(dict=goal))
def test_merge_dict_into_scalar(self):
- p = Parameters(dict(base='foo'))
+ p = Parameters(dict(base='foo'), SETTINGS, '')
+ p2 = Parameters(dict(base=SIMPLE), SETTINGS, '')
with self.assertRaises(TypeError):
- p.merge(Parameters(dict(base=SIMPLE)))
+ p.merge(p2)
+ p.interpolate()
def test_merge_scalar_over_dict(self):
- p = Parameters(dict(base=SIMPLE))
+ settings = Settings({'allow_scalar_over_dict': True})
+ p = Parameters(dict(base=SIMPLE), settings, '')
mergee = {'base':'foo'}
- p.merge(Parameters(mergee))
+ p2 = Parameters(mergee, settings, '')
+ p.merge(p2)
+ p.initialise_interpolation()
self.assertDictEqual(p.as_dict(), mergee)
def test_interpolate_single(self):
v = 42
- d = {'foo': 'bar'.join(PARAMETER_INTERPOLATION_SENTINELS),
+ d = {'foo': 'bar'.join(SETTINGS.reference_sentinels),
'bar': v}
- p = Parameters(d)
+ p = Parameters(d, SETTINGS, '')
p.interpolate()
self.assertEqual(p.as_dict()['foo'], v)
def test_interpolate_multiple(self):
v = '42'
- d = {'foo': 'bar'.join(PARAMETER_INTERPOLATION_SENTINELS) + 'meep'.join(PARAMETER_INTERPOLATION_SENTINELS),
+ d = {'foo': 'bar'.join(SETTINGS.reference_sentinels) + 'meep'.join(SETTINGS.reference_sentinels),
'bar': v[0],
'meep': v[1]}
- p = Parameters(d)
+ p = Parameters(d, SETTINGS, '')
p.interpolate()
self.assertEqual(p.as_dict()['foo'], v)
def test_interpolate_multilevel(self):
v = 42
- d = {'foo': 'bar'.join(PARAMETER_INTERPOLATION_SENTINELS),
- 'bar': 'meep'.join(PARAMETER_INTERPOLATION_SENTINELS),
+ d = {'foo': 'bar'.join(SETTINGS.reference_sentinels),
+ 'bar': 'meep'.join(SETTINGS.reference_sentinels),
'meep': v}
- p = Parameters(d)
+ p = Parameters(d, SETTINGS, '')
p.interpolate()
self.assertEqual(p.as_dict()['foo'], v)
def test_interpolate_list(self):
- l = [41,42,43]
- d = {'foo': 'bar'.join(PARAMETER_INTERPOLATION_SENTINELS),
+ l = [41, 42, 43]
+ d = {'foo': 'bar'.join(SETTINGS.reference_sentinels),
'bar': l}
- p = Parameters(d)
+ p = Parameters(d, SETTINGS, '')
p.interpolate()
self.assertEqual(p.as_dict()['foo'], l)
def test_interpolate_infrecursion(self):
v = 42
- d = {'foo': 'bar'.join(PARAMETER_INTERPOLATION_SENTINELS),
- 'bar': 'foo'.join(PARAMETER_INTERPOLATION_SENTINELS)}
- p = Parameters(d)
+ d = {'foo': 'bar'.join(SETTINGS.reference_sentinels),
+ 'bar': 'foo'.join(SETTINGS.reference_sentinels)}
+ p = Parameters(d, SETTINGS, '')
with self.assertRaises(InfiniteRecursionError):
p.interpolate()
+ def test_nested_references(self):
+ d = {'a': '${${z}}', 'b': 2, 'z': 'b'}
+ r = {'a': 2, 'b': 2, 'z': 'b'}
+ p = Parameters(d, SETTINGS, '')
+ p.interpolate()
+ self.assertEqual(p.as_dict(), r)
+
+ def test_nested_deep_references(self):
+ d = {'one': { 'a': 1, 'b': '${one:${one:c}}', 'c': 'a' } }
+ r = {'one': { 'a': 1, 'b': 1, 'c': 'a'} }
+ p = Parameters(d, SETTINGS, '')
+ p.interpolate()
+ self.assertEqual(p.as_dict(), r)
+
+ def test_stray_occurrence_overwrites_during_interpolation(self):
+ p1 = Parameters({'r' : 1, 'b': '${r}'}, SETTINGS, '')
+ p2 = Parameters({'b' : 2}, SETTINGS, '')
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict()['b'], 2)
+
+ def test_referenced_dict_deep_overwrite(self):
+ p1 = Parameters({'alpha': {'one': {'a': 1, 'b': 2} } }, SETTINGS, '')
+ p2 = Parameters({'beta': '${alpha}'}, SETTINGS, '')
+ p3 = Parameters({'alpha': {'one': {'c': 3, 'd': 4} },
+ 'beta': {'one': {'a': 99} } }, SETTINGS, '')
+ r = {'alpha': {'one': {'a':1, 'b': 2, 'c': 3, 'd':4} },
+ 'beta': {'one': {'a':99, 'b': 2, 'c': 3, 'd':4} } }
+ p1.merge(p2)
+ p1.merge(p3)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_complex_reference_overwriting(self):
+ p1 = Parameters({'one': 'abc_123_${two}_${three}', 'two': 'XYZ', 'four': 4}, SETTINGS, '')
+ p2 = Parameters({'one': 'QWERTY_${three}_${four}', 'three': '999'}, SETTINGS, '')
+ r = {'one': 'QWERTY_999_4', 'two': 'XYZ', 'three': '999', 'four': 4}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_nested_reference_with_overwriting(self):
+ p1 = Parameters({'one': {'a': 1, 'b': 2, 'z': 'a'},
+ 'two': '${one:${one:z}}' }, SETTINGS, '')
+ p2 = Parameters({'one': {'z': 'b'} }, SETTINGS, '')
+ r = {'one': {'a': 1, 'b':2, 'z': 'b'}, 'two': 2}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_merge_referenced_lists(self):
+ p1 = Parameters({'one': [ 1, 2, 3 ], 'two': [ 4, 5, 6 ], 'three': '${one}'}, SETTINGS, '')
+ p2 = Parameters({'three': '${two}'}, SETTINGS, '')
+ r = {'one': [ 1, 2, 3 ], 'two': [ 4, 5, 6], 'three': [ 1, 2, 3, 4, 5, 6 ]}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_merge_referenced_dicts(self):
+ p1 = Parameters({'one': {'a': 1, 'b': 2}, 'two': {'c': 3, 'd': 4}, 'three': '${one}'}, SETTINGS, '')
+ p2 = Parameters({'three': '${two}'}, SETTINGS, '')
+ r = {'one': {'a': 1, 'b': 2}, 'two': {'c': 3, 'd': 4}, 'three': {'a': 1, 'b': 2, 'c': 3, 'd': 4}}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_deep_refs_in_referenced_dicts(self):
+ p = Parameters({'A': '${C:a}', 'B': {'a': 1, 'b': 2}, 'C': '${B}'}, SETTINGS, '')
+ r = {'A': 1, 'B': {'a': 1, 'b': 2}, 'C': {'a': 1, 'b': 2}}
+ p.interpolate()
+ self.assertEqual(p.as_dict(), r)
+
+ def test_overwrite_none(self):
+ p1 = Parameters({'A': None, 'B': None, 'C': None, 'D': None, 'E': None, 'F': None}, SETTINGS, '')
+ p2 = Parameters({'A': 'abc', 'B': [1, 2, 3], 'C': {'a': 'aaa', 'b': 'bbb'}, 'D': '${A}', 'E': '${B}', 'F': '${C}'}, SETTINGS, '')
+ r = {'A': 'abc', 'B': [1, 2, 3], 'C': {'a': 'aaa', 'b': 'bbb'}, 'D': 'abc', 'E': [1, 2, 3], 'F': {'a': 'aaa', 'b': 'bbb'}}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_interpolate_escaping(self):
+ v = 'bar'.join(SETTINGS.reference_sentinels)
+ d = {'foo': SETTINGS.escape_character + 'bar'.join(SETTINGS.reference_sentinels),
+ 'bar': 'unused'}
+ p = Parameters(d, SETTINGS, '')
+ p.initialise_interpolation()
+ self.assertEqual(p.as_dict()['foo'], v)
+
+ def test_interpolate_double_escaping(self):
+ v = SETTINGS.escape_character + 'meep'
+ d = {'foo': SETTINGS.escape_character + SETTINGS.escape_character + 'bar'.join(SETTINGS.reference_sentinels),
+ 'bar': 'meep'}
+ p = Parameters(d, SETTINGS, '')
+ p.interpolate()
+ self.assertEqual(p.as_dict()['foo'], v)
+
+ def test_interpolate_escaping_backwards_compatibility(self):
+ """In all following cases, escaping should not happen and the escape character
+ needs to be printed as-is, to ensure backwards compatibility to older versions."""
+ v = ' '.join([
+ # Escape character followed by unescapable character
+ '1', SETTINGS.escape_character,
+ # Escape character followed by escape character
+ '2', SETTINGS.escape_character + SETTINGS.escape_character,
+ # Escape character followed by interpolation end sentinel
+ '3', SETTINGS.escape_character + SETTINGS.reference_sentinels[1],
+ # Escape character at the end of the string
+ '4', SETTINGS.escape_character
+ ])
+ d = {'foo': v}
+ p = Parameters(d, SETTINGS, '')
+ p.initialise_interpolation()
+ self.assertEqual(p.as_dict()['foo'], v)
+
+ def test_escape_close_in_ref(self):
+ p1 = Parameters({'one}': 1, 'two': '${one\\}}'}, SETTINGS, '')
+ r = {'one}': 1, 'two': 1}
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_double_escape_in_ref(self):
+ d = {'one\\': 1, 'two': '${one\\\\}'}
+ p1 = Parameters(d, SETTINGS, '')
+ r = {'one\\': 1, 'two': 1}
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_merging_for_multiple_nodes(self):
+ p1 = Parameters({ 'alpha': { 'one': 111 }}, SETTINGS, '')
+ p2 = Parameters({ 'beta': {'two': '${alpha:one}' }}, SETTINGS, '')
+ p3 = Parameters({ 'beta': {'two': 222 }}, SETTINGS, '')
+ n1 = Parameters({ 'name': 'node1'}, SETTINGS, '')
+ r1 = { 'alpha': { 'one': 111 }, 'beta': { 'two': 111 }, 'name': 'node1' }
+ r2 = { 'alpha': { 'one': 111 }, 'beta': { 'two': 222 }, 'name': 'node2' }
+ n1.merge(p1)
+ n1.merge(p2)
+ n1.interpolate()
+ n2 = Parameters({'name': 'node2'}, SETTINGS, '')
+ n2.merge(p1)
+ n2.merge(p2)
+ n2.merge(p3)
+ n2.interpolate()
+ self.assertEqual(n1.as_dict(), r1)
+ self.assertEqual(n2.as_dict(), r2)
+
+ def test_list_merging_for_multiple_nodes(self):
+ p1 = Parameters({ 'alpha': { 'one': [1, 2] }}, SETTINGS, '')
+ p2 = Parameters({ 'beta': {'two': '${alpha:one}' }}, SETTINGS, '')
+ p3 = Parameters({ 'beta': {'two': [3] }}, SETTINGS, '')
+ n1 = Parameters({ 'name': 'node1'}, SETTINGS, '')
+ r1 = { 'alpha': { 'one': [1, 2] }, 'beta': { 'two': [1, 2] }, 'name': 'node1' }
+ r2 = { 'alpha': { 'one': [1, 2] }, 'beta': { 'two': [1, 2, 3] }, 'name': 'node2' }
+ n1.merge(p1)
+ n1.merge(p2)
+ n1.interpolate()
+ n2 = Parameters({'name': 'node2'}, SETTINGS, '')
+ n2.merge(p1)
+ n2.merge(p2)
+ n2.merge(p3)
+ n2.interpolate()
+ self.assertEqual(n1.as_dict(), r1)
+ self.assertEqual(n2.as_dict(), r2)
+
+ def test_dict_merging_for_multiple_nodes(self):
+ p1 = Parameters({ 'alpha': { 'one': { 'a': 'aa', 'b': 'bb' }}}, SETTINGS, '')
+ p2 = Parameters({ 'beta': {'two': '${alpha:one}' }}, SETTINGS, '')
+ p3 = Parameters({ 'beta': {'two': {'c': 'cc' }}}, SETTINGS, '')
+ n1 = Parameters({ 'name': 'node1'}, SETTINGS, '')
+ r1 = { 'alpha': { 'one': {'a': 'aa', 'b': 'bb'} }, 'beta': { 'two': {'a': 'aa', 'b': 'bb'} }, 'name': 'node1' }
+ r2 = { 'alpha': { 'one': {'a': 'aa', 'b': 'bb'} }, 'beta': { 'two': {'a': 'aa', 'b': 'bb', 'c': 'cc'} }, 'name': 'node2' }
+ n1.merge(p1)
+ n1.merge(p2)
+ n1.interpolate()
+ n2 = Parameters({'name': 'node2'}, SETTINGS, '')
+ n2.merge(p1)
+ n2.merge(p2)
+ n2.merge(p3)
+ n2.interpolate()
+ self.assertEqual(n1.as_dict(), r1)
+ self.assertEqual(n2.as_dict(), r2)
+
+ def test_list_merging_with_refs_for_multiple_nodes(self):
+ p1 = Parameters({ 'alpha': { 'one': [1, 2], 'two': [3, 4] }}, SETTINGS, '')
+ p2 = Parameters({ 'beta': { 'three': '${alpha:one}' }}, SETTINGS, '')
+ p3 = Parameters({ 'beta': { 'three': '${alpha:two}' }}, SETTINGS, '')
+ p4 = Parameters({ 'beta': { 'three': '${alpha:one}' }}, SETTINGS, '')
+ n1 = Parameters({ 'name': 'node1' }, SETTINGS, '')
+ r1 = {'alpha': {'one': [1, 2], 'two': [3, 4]}, 'beta': {'three': [1, 2]}, 'name': 'node1'}
+ r2 = {'alpha': {'one': [1, 2], 'two': [3, 4]}, 'beta': {'three': [1, 2, 3, 4, 1, 2]}, 'name': 'node2'}
+ n2 = Parameters({ 'name': 'node2' }, SETTINGS, '')
+ n2.merge(p1)
+ n2.merge(p2)
+ n2.merge(p3)
+ n2.merge(p4)
+ n2.interpolate()
+ n1.merge(p1)
+ n1.merge(p2)
+ n1.interpolate()
+ self.assertEqual(n1.as_dict(), r1)
+ self.assertEqual(n2.as_dict(), r2)
+
+ def test_nested_refs_with_multiple_nodes(self):
+ p1 = Parameters({ 'alpha': { 'one': 1, 'two': 2 } }, SETTINGS, '')
+ p2 = Parameters({ 'beta': { 'three': 'one' } }, SETTINGS, '')
+ p3 = Parameters({ 'beta': { 'three': 'two' } }, SETTINGS, '')
+ p4 = Parameters({ 'beta': { 'four': '${alpha:${beta:three}}' } }, SETTINGS, '')
+ n1 = Parameters({ 'name': 'node1' }, SETTINGS, '')
+ r1 = {'alpha': {'one': 1, 'two': 2}, 'beta': {'three': 'one', 'four': 1}, 'name': 'node1'}
+ r2 = {'alpha': {'one': 1, 'two': 2}, 'beta': {'three': 'two', 'four': 2}, 'name': 'node2'}
+ n1.merge(p1)
+ n1.merge(p4)
+ n1.merge(p2)
+ n1.interpolate()
+ n2 = Parameters({ 'name': 'node2' }, SETTINGS, '')
+ n2.merge(p1)
+ n2.merge(p4)
+ n2.merge(p3)
+ n2.interpolate()
+ self.assertEqual(n1.as_dict(), r1)
+ self.assertEqual(n2.as_dict(), r2)
+
+ def test_nested_refs_error_message(self):
+ # beta is missing, oops
+ p1 = Parameters({'alpha': {'one': 1, 'two': 2}, 'gamma': '${alpha:${beta}}'}, SETTINGS, '')
+ with self.assertRaises(InterpolationError) as error:
+ p1.interpolate()
+ self.assertEqual(error.exception.message, "-> \n Bad references, at gamma\n ${beta}")
+
+ def test_multiple_resolve_errors(self):
+ p1 = Parameters({'alpha': '${gamma}', 'beta': '${gamma}'}, SETTINGS, '')
+ with self.assertRaises(ResolveErrorList) as error:
+ p1.interpolate()
+ self.assertEqual(error.exception.message, "-> \n Cannot resolve ${gamma}, at alpha\n Cannot resolve ${gamma}, at beta")
+
+ def test_force_single_resolve_error(self):
+ settings = copy.deepcopy(SETTINGS)
+ settings.group_errors = False
+ p1 = Parameters({'alpha': '${gamma}', 'beta': '${gamma}'}, settings, '')
+ with self.assertRaises(ResolveError) as error:
+ p1.interpolate()
+ self.assertEqual(error.exception.message, "-> \n Cannot resolve ${gamma}, at alpha")
+
+ def test_ignore_overwritten_missing_reference(self):
+ settings = copy.deepcopy(SETTINGS)
+ settings.ignore_overwritten_missing_references = True
+ p1 = Parameters({'alpha': '${beta}'}, settings, '')
+ p2 = Parameters({'alpha': '${gamma}'}, settings, '')
+ p3 = Parameters({'gamma': 3}, settings, '')
+ r1 = {'alpha': 3, 'gamma': 3}
+ p1.merge(p2)
+ p1.merge(p3)
+ err1 = "[WARNING] Reference '${beta}' undefined\n"
+ with mock.patch('sys.stderr', new=MockDevice()) as std_err:
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r1)
+ self.assertEqual(std_err.text(), err1)
+
+ def test_ignore_overwritten_missing_reference_last_value(self):
+ # an error should be raised if the last reference to be merged
+ # is missing even if ignore_overwritten_missing_references is true
+ settings = copy.deepcopy(SETTINGS)
+ settings.ignore_overwritten_missing_references = True
+ p1 = Parameters({'alpha': '${gamma}'}, settings, '')
+ p2 = Parameters({'alpha': '${beta}'}, settings, '')
+ p3 = Parameters({'gamma': 3}, settings, '')
+ p1.merge(p2)
+ p1.merge(p3)
+ with self.assertRaises(InterpolationError) as error:
+ p1.interpolate()
+ self.assertEqual(error.exception.message, "-> \n Cannot resolve ${beta}, at alpha")
+
+ def test_ignore_overwritten_missing_reference_dict(self):
+ # setting ignore_overwritten_missing_references to true should
+ # not change the behaviour for dicts
+ settings = copy.deepcopy(SETTINGS)
+ settings.ignore_overwritten_missing_references = True
+ p1 = Parameters({'alpha': '${beta}'}, settings, '')
+ p2 = Parameters({'alpha': '${gamma}'}, settings, '')
+ p3 = Parameters({'gamma': {'one': 1, 'two': 2}}, settings, '')
+ err1 = "[WARNING] Reference '${beta}' undefined\n"
+ p1.merge(p2)
+ p1.merge(p3)
+ with self.assertRaises(InterpolationError) as error, mock.patch('sys.stderr', new=MockDevice()) as std_err:
+ p1.interpolate()
+ self.assertEqual(error.exception.message, "-> \n Cannot resolve ${beta}, at alpha")
+ self.assertEqual(std_err.text(), err1)
+
+ def test_escaped_string_in_ref_dict_1(self):
+ # test with escaped string in first dict to be merged
+ p1 = Parameters({'a': { 'one': '${a_ref}' }, 'b': { 'two': '\${not_a_ref}' }, 'c': '${b}', 'a_ref': 123}, SETTINGS, '')
+ p2 = Parameters({'c': '${a}'}, SETTINGS, '')
+ r = { 'a': { 'one': 123 }, 'b': { 'two': '${not_a_ref}' }, 'c': { 'one': 123, 'two': '${not_a_ref}' }, 'a_ref': 123}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
+ def test_escaped_string_in_ref_dict_2(self):
+ # test with escaped string in second dict to be merged
+ p1 = Parameters({'a': { 'one': '${a_ref}' }, 'b': { 'two': '\${not_a_ref}' }, 'c': '${a}', 'a_ref': 123}, SETTINGS, '')
+ p2 = Parameters({'c': '${b}'}, SETTINGS, '')
+ r = { 'a': { 'one': 123 }, 'b': { 'two': '${not_a_ref}' }, 'c': { 'one': 123, 'two': '${not_a_ref}' }, 'a_ref': 123}
+ p1.merge(p2)
+ p1.interpolate()
+ self.assertEqual(p1.as_dict(), r)
+
if __name__ == '__main__':
unittest.main()
diff --git a/reclass/defaults.py b/reclass/defaults.py
index 4d866d4..980bb92 100644
--- a/reclass/defaults.py
+++ b/reclass/defaults.py
@@ -15,9 +15,24 @@
OPT_NODES_URI = 'nodes'
OPT_CLASSES_URI = 'classes'
OPT_PRETTY_PRINT = True
+OPT_GROUP_ERRORS = True
+OPT_NO_REFS = False
OPT_OUTPUT = 'yaml'
+
OPT_IGNORE_CLASS_NOTFOUND = False
-OPT_IGNORE_CLASS_REGEXP = ['.*']
+OPT_IGNORE_CLASS_NOTFOUND_REGEXP = ['.*']
+OPT_IGNORE_CLASS_NOTFOUND_WARNING = True
+
+OPT_IGNORE_OVERWRITTEN_MISSING_REFERENCES = True
+
+OPT_ALLOW_SCALAR_OVER_DICT = False
+OPT_ALLOW_SCALAR_OVER_LIST = False
+OPT_ALLOW_LIST_OVER_SCALAR = False
+OPT_ALLOW_DICT_OVER_SCALAR = False
+OPT_ALLOW_NONE_OVERRIDE = True
+
+OPT_INVENTORY_IGNORE_FAILED_NODE = False
+OPT_INVENTORY_IGNORE_FAILED_RENDER = False
CONFIG_FILE_SEARCH_PATH = [os.getcwd(),
os.path.expanduser('~'),
@@ -26,6 +41,11 @@
]
CONFIG_FILE_NAME = RECLASS_NAME + '-config.yml'
-PARAMETER_INTERPOLATION_SENTINELS = ('${', '}')
+REFERENCE_SENTINELS = ('${', '}')
+EXPORT_SENTINELS = ('$[', ']')
PARAMETER_INTERPOLATION_DELIMITER = ':'
PARAMETER_DICT_KEY_OVERRIDE_PREFIX = '~'
+ESCAPE_CHARACTER = '\\'
+
+AUTOMATIC_RECLASS_PARAMETERS = True
+DEFAULT_ENVIRONMENT = 'base'
diff --git a/reclass/errors.py b/reclass/errors.py
index 9228985..a96c47b 100644
--- a/reclass/errors.py
+++ b/reclass/errors.py
@@ -10,15 +10,19 @@
import posix, sys
import traceback
-from reclass.defaults import PARAMETER_INTERPOLATION_SENTINELS
+from reclass.defaults import REFERENCE_SENTINELS, EXPORT_SENTINELS
class ReclassException(Exception):
- def __init__(self, rc=posix.EX_SOFTWARE, msg=None):
+ def __init__(self, rc=posix.EX_SOFTWARE, msg=None, tbFlag=True):
super(ReclassException, self).__init__()
self._rc = rc
self._msg = msg
- self._traceback = traceback.format_exc()
+ if tbFlag:
+ self._traceback = traceback.format_exc()
+ else:
+ self._traceback = None
+ self._full_traceback = False
message = property(lambda self: self._get_message())
rc = property(lambda self: self._rc)
@@ -33,9 +37,16 @@
return 'No error message provided.'
def exit_with_message(self, out=sys.stderr):
- print >>out, self.message
+ if self._full_traceback:
+ t, v, tb = sys.exc_info()
+ print >>out, 'Full Traceback:'
+ for l in traceback.format_tb(tb):
+ print >>out, l,
+ print >>out
if self._traceback:
print >>out, self._traceback
+ print >>out, self.message
+ print >>out
sys.exit(self.rc)
@@ -92,83 +103,166 @@
def __init__(self, storage, nodename, uri):
super(NodeNotFound, self).__init__(msg=None)
- self._storage = storage
- self._name = nodename
- self._uri = uri
+ self.storage = storage
+ self.name = nodename
+ self.uri = uri
def _get_message(self):
msg = "Node '{0}' not found under {1}://{2}"
- return msg.format(self._name, self._storage, self._uri)
-
-
-class ClassNotFound(NotFoundError):
-
- def __init__(self, storage, classname, uri, nodename=None):
- super(ClassNotFound, self).__init__(msg=None)
- self._storage = storage
- self._name = classname
- self._uri = uri
- self._nodename = nodename
-
- def _get_message(self):
- if self._nodename:
- msg = "Class '{0}' (in ancestry of node '{1}') not found " \
- "under {2}://{3}"
- else:
- msg = "Class '{0}' not found under {2}://{3}"
- return msg.format(self._name, self._nodename, self._storage, self._uri)
-
- def set_nodename(self, nodename):
- self._nodename = nodename
+ return msg.format(self.name, self.storage, self.uri)
class InterpolationError(ReclassException):
- def __init__(self, msg, rc=posix.EX_DATAERR):
- super(InterpolationError, self).__init__(rc=rc, msg=msg)
-
-
-class UndefinedVariableError(InterpolationError):
-
- def __init__(self, var, context=None):
- super(UndefinedVariableError, self).__init__(msg=None)
- self._var = var
- self._context = context
- var = property(lambda self: self._var)
- context = property(lambda self: self._context)
+ def __init__(self, msg, rc=posix.EX_DATAERR, nodename='', uri=None, context=None, tbFlag=True):
+ super(InterpolationError, self).__init__(rc=rc, msg=msg, tbFlag=tbFlag)
+ self.nodename = nodename
+ self.uri = uri
+ self.context = context
def _get_message(self):
- msg = "Cannot resolve " + self._var.join(PARAMETER_INTERPOLATION_SENTINELS)
- if self._context:
- msg += ' in the context of %s' % self._context
+ msg = '-> {0}\n'.format(self.nodename)
+ msg += self._render_error_message(self._get_error_message(), 1)
+ msg = msg[:-1]
return msg
- def set_context(self, context):
- self._context = context
+ def _render_error_message(self, message_list, indent):
+ msg = ''
+ for l in message_list:
+ if isinstance(l, list):
+ msg += self._render_error_message(l, indent + 1)
+ else:
+ msg += (' ' * indent * 3) + l + '\n'
+ return msg
+
+ def _add_context_and_uri(self):
+ msg = ''
+ if self.context:
+ msg += ', at %s' % self.context
+ if self.uri:
+ msg += ', in %s' % self.uri
+ return msg
-class IncompleteInterpolationError(InterpolationError):
+class ClassNotFound(InterpolationError):
- def __init__(self, string, end_sentinel):
- super(IncompleteInterpolationError, self).__init__(msg=None)
- self._ref = string.join(PARAMETER_INTERPOLATION_SENTINELS)
- self._end_sentinel = end_sentinel
+ def __init__(self, storage, classname, path, nodename='', uri=None):
+ super(ClassNotFound, self).__init__(msg=None, uri=uri, nodename=nodename)
+ self.storage = storage
+ self.name = classname
+ self.path = path
- def _get_message(self):
- msg = "Missing '{0}' to end reference: {1}"
- return msg.format(self._end_sentinel, self._ref)
+ def _get_error_message(self):
+ msg = [ 'In {0}'.format(self.uri),
+ 'Class {0} not found under {1}://{2}'.format(self.name, self.storage, self.path) ]
+ return msg
+
+
+class InvQueryClassNotFound(InterpolationError):
+
+ def __init__(self, classNotFoundError, nodename=''):
+ super(InvQueryClassNotFound, self).__init__(msg=None, nodename=nodename)
+ self.classNotFoundError = classNotFoundError
+ self._traceback = self.classNotFoundError._traceback
+
+ def _get_error_message(self):
+ msg = [ 'Inventory Queries:',
+ '-> {0}'.format(self.classNotFoundError.nodename) ]
+ msg.append(self.classNotFoundError._get_error_message())
+ return msg
+
+
+class ResolveError(InterpolationError):
+
+ def __init__(self, reference, uri=None, context=None):
+ super(ResolveError, self).__init__(msg=None, uri=uri, context=context)
+ self.reference = reference
+
+ def _get_error_message(self):
+ msg = 'Cannot resolve {0}'.format(self.reference.join(REFERENCE_SENTINELS)) + self._add_context_and_uri()
+ return [ msg ]
+
+class ResolveErrorList(InterpolationError):
+ def __init__(self):
+ super(ResolveErrorList, self).__init__(msg=None)
+ self.resolve_errors = []
+ self._traceback = False
+
+ def add(self, resolve_error):
+ self.resolve_errors.append(resolve_error)
+
+ def have_errors(self):
+ return len(self.resolve_errors) > 0
+
+ def _get_error_message(self):
+ msgs = []
+ for e in self.resolve_errors:
+ msgs.extend(e._get_error_message())
+ return msgs
+
+
+class InvQueryError(InterpolationError):
+
+ def __init__(self, query, resolveError, uri=None, context=None):
+ super(InvQueryError, self).__init__(msg=None, uri=uri, context=context)
+ self.query = query
+ self.resolveError = resolveError
+ self._traceback = self.resolveError._traceback
+
+ def _get_error_message(self):
+ msg1 = 'Failed inv query {0}'.format(self.query.join(EXPORT_SENTINELS)) + self._add_context_and_uri()
+ msg2 = '-> {0}'.format(self.resolveError.nodename)
+ msg3 = self.resolveError._get_error_message()
+ return [ msg1, msg2, msg3 ]
+
+
+class ParseError(InterpolationError):
+
+ def __init__(self, msg, line, col, lineno, rc=posix.EX_DATAERR):
+ super(ParseError, self).__init__(rc=rc, msg=None)
+ self._err = msg
+ self._line = line
+ self._col = col
+ self._lineno = lineno
+
+ def _get_error_message(self):
+ msg = [ 'Parse error: {0}'.format(self._line.join(EXPORT_SENTINELS)) + self._add_context_and_uri() ]
+ msg.append('{0} at char {1}'.format(self._err, self._col - 1))
+ return msg
class InfiniteRecursionError(InterpolationError):
- def __init__(self, path, ref):
- super(InfiniteRecursionError, self).__init__(msg=None)
- self._path = path
- self._ref = ref.join(PARAMETER_INTERPOLATION_SENTINELS)
+ def __init__(self, context, ref, uri):
+ super(InfiniteRecursionError, self).__init__(msg=None, tbFlag=False, uri=uri)
+ self.context = context
+ self.ref = ref
- def _get_message(self):
- msg = "Infinite recursion while resolving {0} at {1}"
- return msg.format(self._ref, self._path)
+ def _get_error_message(self):
+ msg = [ 'Infinite recursion: {0}'.format(self.ref.join(REFERENCE_SENTINELS)) + self._add_context_and_uri() ]
+ return msg
+
+
+class BadReferencesError(InterpolationError):
+
+ def __init__(self, refs, context, uri):
+ super(BadReferencesError, self).__init__(msg=None, context=context, uri=uri, tbFlag=False)
+ self.refs = [ r.join(REFERENCE_SENTINELS) for r in refs ]
+
+ def _get_error_message(self):
+ msg = [ 'Bad references' + self._add_context_and_uri(),
+ ' ' + ', '.join(self.refs) ]
+ return msg
+
+
+class ExpressionError(InterpolationError):
+
+ def __init__(self, msg, rc=posix.EX_DATAERR, tbFlag=True):
+ super(ExpressionError, self).__init__(rc=rc, msg=None, tbFlag=tbFlag)
+ self._error_msg = msg
+
+ def _get_error_message(self):
+ msg = [ 'Expression error: {0}'.format(self._error_msg) + self._add_context_and_uri() ]
+ return msg
class MappingError(ReclassException):
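The new `_render_error_message` helper above indents plain strings by three spaces per nesting level and recurses into nested lists; a standalone sketch of that rendering logic (the `render` function here is illustrative, not part of reclass):

```python
def render(message_list, indent=1):
    # Mirrors InterpolationError._render_error_message: strings are
    # indented 3 spaces per level, nested lists recurse one level deeper.
    out = ''
    for item in message_list:
        if isinstance(item, list):
            out += render(item, indent + 1)
        else:
            out += ' ' * (indent * 3) + item + '\n'
    return out

# e.g. the nested shape produced by InvQueryError._get_error_message
print(render(['Failed inv query $[ exports:host ]', '-> node1',
              ['Cannot resolve ${beta}, at alpha']]))
```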
diff --git a/reclass/output/json_outputter.py b/reclass/output/json_outputter.py
index dab86ed..8c79039 100644
--- a/reclass/output/json_outputter.py
+++ b/reclass/output/json_outputter.py
@@ -11,7 +11,7 @@
class Outputter(OutputterBase):
- def dump(self, data, pretty_print=False):
+ def dump(self, data, pretty_print=False, no_refs=False):
separators = (',', ': ') if pretty_print else (',', ':')
indent = 2 if pretty_print else None
return json.dumps(data, indent=indent, separators=separators)
diff --git a/reclass/output/yaml_outputter.py b/reclass/output/yaml_outputter.py
index 2c70cc3..9a0d098 100644
--- a/reclass/output/yaml_outputter.py
+++ b/reclass/output/yaml_outputter.py
@@ -11,5 +11,16 @@
class Outputter(OutputterBase):
- def dump(self, data, pretty_print=False):
- return yaml.dump(data, default_flow_style=not pretty_print)
+ def dump(self, data, pretty_print=False, no_refs=False):
+ if no_refs:
+ return yaml.dump(data, default_flow_style=not pretty_print, Dumper=ExplicitDumper)
+ else:
+ return yaml.dump(data, default_flow_style=not pretty_print)
+
+class ExplicitDumper(yaml.SafeDumper):
+ """
+ A dumper that will never emit aliases.
+ """
+
+ def ignore_aliases(self, data):
+ return True
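The `no_refs` path above relies on `ExplicitDumper` suppressing YAML anchors and aliases; this is standard PyYAML behaviour, demonstrated here outside reclass:

```python
import yaml

class ExplicitDumper(yaml.SafeDumper):
    """A dumper that will never emit aliases."""
    def ignore_aliases(self, data):
        return True

shared = {'port': 8080}
data = {'a': shared, 'b': shared}   # same object referenced twice

print(yaml.dump(data, default_flow_style=False))
# the default dumper emits an anchor/alias pair for the shared dict

print(yaml.dump(data, default_flow_style=False, Dumper=ExplicitDumper))
# ExplicitDumper expands the shared dict in place under both keys
```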
diff --git a/reclass/settings.py b/reclass/settings.py
new file mode 100644
index 0000000..44c58d8
--- /dev/null
+++ b/reclass/settings.py
@@ -0,0 +1,63 @@
+import copy
+import reclass.values.parser_funcs
+from reclass.defaults import *
+
+class Settings(object):
+
+ def __init__(self, options={}):
+ self.allow_scalar_over_dict = options.get('allow_scalar_over_dict', OPT_ALLOW_SCALAR_OVER_DICT)
+ self.allow_scalar_over_list = options.get('allow_scalar_over_list', OPT_ALLOW_SCALAR_OVER_LIST)
+ self.allow_list_over_scalar = options.get('allow_list_over_scalar', OPT_ALLOW_LIST_OVER_SCALAR)
+ self.allow_dict_over_scalar = options.get('allow_dict_over_scalar', OPT_ALLOW_DICT_OVER_SCALAR)
+ self.allow_none_override = options.get('allow_none_override', OPT_ALLOW_NONE_OVERRIDE)
+ self.automatic_parameters = options.get('automatic_parameters', AUTOMATIC_RECLASS_PARAMETERS)
+ self.default_environment = options.get('default_environment', DEFAULT_ENVIRONMENT)
+ self.delimiter = options.get('delimiter', PARAMETER_INTERPOLATION_DELIMITER)
+ self.dict_key_override_prefix = options.get('dict_key_override_prefix', PARAMETER_DICT_KEY_OVERRIDE_PREFIX)
+ self.escape_character = options.get('escape_character', ESCAPE_CHARACTER)
+ self.export_sentinels = options.get('export_sentinels', EXPORT_SENTINELS)
+ self.inventory_ignore_failed_node = options.get('inventory_ignore_failed_node', OPT_INVENTORY_IGNORE_FAILED_NODE)
+ self.inventory_ignore_failed_render = options.get('inventory_ignore_failed_render', OPT_INVENTORY_IGNORE_FAILED_RENDER)
+ self.reference_sentinels = options.get('reference_sentinels', REFERENCE_SENTINELS)
+ self.ignore_class_notfound = options.get('ignore_class_notfound', OPT_IGNORE_CLASS_NOTFOUND)
+
+ self.ignore_class_notfound_regexp = options.get('ignore_class_notfound_regexp', OPT_IGNORE_CLASS_NOTFOUND_REGEXP)
+ if isinstance(self.ignore_class_notfound_regexp, basestring):
+ self.ignore_class_notfound_regexp = [ self.ignore_class_notfound_regexp ]
+
+ self.ignore_class_notfound_warning = options.get('ignore_class_notfound_warning', OPT_IGNORE_CLASS_NOTFOUND_WARNING)
+ self.ignore_overwritten_missing_references = options.get('ignore_overwritten_missing_references', OPT_IGNORE_OVERWRITTEN_MISSING_REFERENCES)
+
+ self.group_errors = options.get('group_errors', OPT_GROUP_ERRORS)
+
+ self.ref_parser = reclass.values.parser_funcs.get_ref_parser(self.escape_character, self.reference_sentinels, self.export_sentinels)
+ self.simple_ref_parser = reclass.values.parser_funcs.get_simple_ref_parser(self.escape_character, self.reference_sentinels, self.export_sentinels)
+
+ def __eq__(self, other):
+ return isinstance(other, type(self)) \
+ and self.allow_scalar_over_dict == other.allow_scalar_over_dict \
+ and self.allow_scalar_over_list == other.allow_scalar_over_list \
+ and self.allow_list_over_scalar == other.allow_list_over_scalar \
+ and self.allow_dict_over_scalar == other.allow_dict_over_scalar \
+ and self.allow_none_override == other.allow_none_override \
+ and self.automatic_parameters == other.automatic_parameters \
+ and self.default_environment == other.default_environment \
+ and self.delimiter == other.delimiter \
+ and self.dict_key_override_prefix == other.dict_key_override_prefix \
+ and self.escape_character == other.escape_character \
+ and self.export_sentinels == other.export_sentinels \
+ and self.inventory_ignore_failed_node == other.inventory_ignore_failed_node \
+ and self.inventory_ignore_failed_render == other.inventory_ignore_failed_render \
+ and self.reference_sentinels == other.reference_sentinels \
+ and self.ignore_class_notfound == other.ignore_class_notfound \
+ and self.ignore_class_notfound_regexp == other.ignore_class_notfound_regexp \
+ and self.ignore_class_notfound_warning == other.ignore_class_notfound_warning
+
+ def __copy__(self):
+ cls = self.__class__
+ result = cls.__new__(cls)
+ result.__dict__.update(self.__dict__)
+ return result
+
+ def __deepcopy__(self, memo):
+ return self.__copy__()
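The `Settings` constructor above resolves each option against a module-level default via `options.get(key, DEFAULT)`; a minimal sketch of that pattern (`MiniSettings` and its defaults are illustrative only, not reclass code):

```python
# Illustrative mini-version of the Settings option handling.
DEFAULTS = {'delimiter': ':', 'escape_character': '\\'}

class MiniSettings(object):
    def __init__(self, options=None):
        options = options or {}
        for key, default in DEFAULTS.items():
            # each option falls back to its module-level default
            setattr(self, key, options.get(key, default))

s = MiniSettings({'delimiter': '.'})
print(s.delimiter)         # '.' (overridden)
print(s.escape_character)  # '\\' (default)
```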
diff --git a/reclass/storage/__init__.py b/reclass/storage/__init__.py
index 8ae2408..f49ac16 100644
--- a/reclass/storage/__init__.py
+++ b/reclass/storage/__init__.py
@@ -14,14 +14,18 @@
name = property(lambda self: self._name)
- def get_node(self, name, merge_base=None):
+ def get_node(self, name, settings):
msg = "Storage class '{0}' does not implement node entity retrieval."
raise NotImplementedError(msg.format(self.name))
- def get_class(self, name):
+ def get_class(self, name, environment, settings):
msg = "Storage class '{0}' does not implement class entity retrieval."
raise NotImplementedError(msg.format(self.name))
def enumerate_nodes(self):
msg = "Storage class '{0}' does not implement node enumeration."
raise NotImplementedError(msg.format(self.name))
+
+ def path_mangler(self):
+ msg = "Storage class '{0}' does not implement path_mangler."
+ raise NotImplementedError(msg.format(self.name))
diff --git a/reclass/storage/common.py b/reclass/storage/common.py
new file mode 100644
index 0000000..6a77fc8
--- /dev/null
+++ b/reclass/storage/common.py
@@ -0,0 +1,22 @@
+import os
+
+class NameMangler:
+ @staticmethod
+ def nodes(relpath, name):
+ # nodes are identified just by their basename, so
+ # no mangling required
+ return relpath, name
+
+ @staticmethod
+ def classes(relpath, name):
+ if relpath == '.' or relpath == '':
+ # './' is converted to None
+ return None, name
+ parts = relpath.split(os.path.sep)
+ if name != 'init':
+ # "init" is the directory index, so only append the basename
+ # to the path parts for all other filenames. This has the
+ # effect that data in file "foo/init.yml" will be registered
+ # as data for class "foo", not "foo.init"
+ parts.append(name)
+ return relpath, '.'.join(parts)
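A standalone sketch of how `NameMangler.classes` maps file paths to dotted class names (`mangle_class` is an illustrative copy, not the reclass API):

```python
import os

def mangle_class(relpath, name):
    # Illustrative copy of NameMangler.classes: map a class file's
    # directory-relative path and basename to a dotted class name.
    if relpath in ('.', ''):
        return None, name              # top-level files keep their name
    parts = relpath.split(os.path.sep)
    if name != 'init':
        parts.append(name)             # foo/bar.yml -> class "foo.bar"
    # "init" is the directory index: foo/init.yml -> class "foo"
    return relpath, '.'.join(parts)

print(mangle_class('.', 'top'))                               # (None, 'top')
print(mangle_class('foo', 'init'))                            # ('foo', 'foo')
print(mangle_class(os.path.sep.join(['foo', 'bar']), 'baz'))  # ends in 'foo.bar.baz'
```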
diff --git a/reclass/storage/loader.py b/reclass/storage/loader.py
index 399e7fd..77fdecb 100644
--- a/reclass/storage/loader.py
+++ b/reclass/storage/loader.py
@@ -23,3 +23,9 @@
'"{1}"'.format(self._name, klassname))
return klass
+
+ def path_mangler(self, name='path_mangler'):
+ function = getattr(self._module, name, None)
+ if function is None:
+ raise AttributeError('Storage backend class {0} does not export "{1}"'.format(self._name, name))
+ return function
diff --git a/reclass/storage/memcache_proxy.py b/reclass/storage/memcache_proxy.py
index 7d9ab5e..405ea8e 100644
--- a/reclass/storage/memcache_proxy.py
+++ b/reclass/storage/memcache_proxy.py
@@ -30,30 +30,27 @@
name = property(lambda self: self._real_storage.name)
- @staticmethod
- def _cache_proxy(name, cache, getter):
+ def get_node(self, name, settings):
+ if not self._cache_nodes:
+ return self._real_storage.get_node(name, settings)
try:
- ret = cache[name]
-
+ return self._nodes_cache[name]
except KeyError, e:
- ret = getter(name)
- cache[name] = ret
-
+ ret = self._real_storage.get_node(name, settings)
+ self._nodes_cache[name] = ret
return ret
- def get_node(self, name):
- if not self._cache_nodes:
- return self._real_storage.get_node(name)
-
- return MemcacheProxy._cache_proxy(name, self._nodes_cache,
- self._real_storage.get_node)
-
- def get_class(self, name):
+ def get_class(self, name, environment, settings):
if not self._cache_classes:
- return self._real_storage.get_class(name)
-
- return MemcacheProxy._cache_proxy(name, self._classes_cache,
- self._real_storage.get_class)
+ return self._real_storage.get_class(name, environment, settings)
+ try:
+ return self._classes_cache[environment][name]
+ except KeyError, e:
+ if environment not in self._classes_cache:
+ self._classes_cache[environment] = dict()
+ ret = self._real_storage.get_class(name, environment, settings)
+ self._classes_cache[environment][name] = ret
+ return ret
def enumerate_nodes(self):
if not self._cache_nodelist:
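The rewritten `get_class` above keys its cache by environment first, then by class name, so the same class can be cached per environment. The memoisation pattern in isolation (`ClassCache` is an illustrative sketch, not reclass code):

```python
# Sketch of the two-level (environment, then class name) cache
# used by MemcacheProxy.get_class.
class ClassCache(object):
    def __init__(self, loader):
        self._loader = loader     # stands in for real_storage.get_class
        self._cache = {}

    def get(self, name, environment):
        env_cache = self._cache.setdefault(environment, {})
        if name not in env_cache:
            env_cache[name] = self._loader(name, environment)
        return env_cache[name]

calls = []
def load(name, env):
    calls.append((name, env))
    return '{0}@{1}'.format(name, env)

cache = ClassCache(load)
print(cache.get('foo', 'base'))  # loads from the backend
print(cache.get('foo', 'base'))  # served from cache, no second load
print(cache.get('foo', 'dev'))   # different environment: separate entry
print(len(calls))                # 2
```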
diff --git a/reclass/storage/mixed/__init__.py b/reclass/storage/mixed/__init__.py
new file mode 100644
index 0000000..4651e00
--- /dev/null
+++ b/reclass/storage/mixed/__init__.py
@@ -0,0 +1,58 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+
+import collections
+import copy
+
+import reclass.errors
+from reclass import get_storage
+from reclass.storage import NodeStorageBase
+
+def path_mangler(inventory_base_uri, nodes_uri, classes_uri):
+ if nodes_uri == classes_uri:
+ raise reclass.errors.DuplicateUriError(nodes_uri, classes_uri)
+ return nodes_uri, classes_uri
+
+STORAGE_NAME = 'mixed'
+
+class ExternalNodeStorage(NodeStorageBase):
+
+ MixedUri = collections.namedtuple('MixedUri', 'storage_type options')
+
+ def __init__(self, nodes_uri, classes_uri):
+ super(ExternalNodeStorage, self).__init__(STORAGE_NAME)
+
+ self._nodes_uri = self._uri(nodes_uri)
+ self._nodes_storage = get_storage(self._nodes_uri.storage_type, self._nodes_uri.options, None)
+ self._classes_default_uri = self._uri(classes_uri)
+ self._classes_default_storage = get_storage(self._classes_default_uri.storage_type, None, self._classes_default_uri.options)
+
+ self._classes_storage = dict()
+ if 'env_overrides' in classes_uri:
+ for override in classes_uri['env_overrides']:
+ for env, options in override.iteritems():
+ uri = copy.deepcopy(classes_uri)
+ uri.update(options)
+ uri = self._uri(uri)
+ self._classes_storage[env] = get_storage(uri.storage_type, None, uri.options)
+
+ def _uri(self, uri):
+ ret = copy.deepcopy(uri)
+ ret['storage_type'] = uri['storage_type']
+ if 'env_overrides' in ret:
+ del ret['env_overrides']
+ if uri['storage_type'] == 'yaml_fs':
+ ret = ret['uri']
+ return self.MixedUri(uri['storage_type'], ret)
+
+ def get_node(self, name, settings):
+ return self._nodes_storage.get_node(name, settings)
+
+ def get_class(self, name, environment, settings):
+ storage = self._classes_storage.get(environment, self._classes_default_storage)
+ return storage.get_class(name, environment, settings)
+
+ def enumerate_nodes(self):
+ return self._nodes_storage.enumerate_nodes()
diff --git a/reclass/storage/tests/test_memcache_proxy.py b/reclass/storage/tests/test_memcache_proxy.py
index 066c27e..a47c29d 100644
--- a/reclass/storage/tests/test_memcache_proxy.py
+++ b/reclass/storage/tests/test_memcache_proxy.py
@@ -6,6 +6,8 @@
# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
# Released under the terms of the Artistic Licence 2.0
#
+
+from reclass.settings import Settings
from reclass.storage.memcache_proxy import MemcacheProxy
from reclass.storage import NodeStorageBase
@@ -22,48 +24,48 @@
def test_no_nodes_caching(self):
p = MemcacheProxy(self._storage, cache_nodes=False)
- NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'
+ NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'; SETTINGS = Settings()
self._storage.get_node.return_value = RET
- self.assertEqual(p.get_node(NAME), RET)
- self.assertEqual(p.get_node(NAME), RET)
- self.assertEqual(p.get_node(NAME2), RET)
- self.assertEqual(p.get_node(NAME2), RET)
- expected = [mock.call(NAME), mock.call(NAME),
- mock.call(NAME2), mock.call(NAME2)]
+ self.assertEqual(p.get_node(NAME, SETTINGS), RET)
+ self.assertEqual(p.get_node(NAME, SETTINGS), RET)
+ self.assertEqual(p.get_node(NAME2, SETTINGS), RET)
+ self.assertEqual(p.get_node(NAME2, SETTINGS), RET)
+ expected = [mock.call(NAME, SETTINGS), mock.call(NAME, SETTINGS),
+ mock.call(NAME2, SETTINGS), mock.call(NAME2, SETTINGS)]
self.assertListEqual(self._storage.get_node.call_args_list, expected)
def test_nodes_caching(self):
p = MemcacheProxy(self._storage, cache_nodes=True)
- NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'
+ NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'; SETTINGS = Settings()
self._storage.get_node.return_value = RET
- self.assertEqual(p.get_node(NAME), RET)
- self.assertEqual(p.get_node(NAME), RET)
- self.assertEqual(p.get_node(NAME2), RET)
- self.assertEqual(p.get_node(NAME2), RET)
- expected = [mock.call(NAME), mock.call(NAME2)] # called once each
+ self.assertEqual(p.get_node(NAME, SETTINGS), RET)
+ self.assertEqual(p.get_node(NAME, SETTINGS), RET)
+ self.assertEqual(p.get_node(NAME2, SETTINGS), RET)
+ self.assertEqual(p.get_node(NAME2, SETTINGS), RET)
+ expected = [mock.call(NAME, SETTINGS), mock.call(NAME2, SETTINGS)] # called once each
self.assertListEqual(self._storage.get_node.call_args_list, expected)
def test_no_classes_caching(self):
p = MemcacheProxy(self._storage, cache_classes=False)
- NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'
+ NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'; SETTINGS = Settings()
self._storage.get_class.return_value = RET
- self.assertEqual(p.get_class(NAME), RET)
- self.assertEqual(p.get_class(NAME), RET)
- self.assertEqual(p.get_class(NAME2), RET)
- self.assertEqual(p.get_class(NAME2), RET)
- expected = [mock.call(NAME), mock.call(NAME),
- mock.call(NAME2), mock.call(NAME2)]
+ self.assertEqual(p.get_class(NAME, None, SETTINGS), RET)
+ self.assertEqual(p.get_class(NAME, None, SETTINGS), RET)
+ self.assertEqual(p.get_class(NAME2, None, SETTINGS), RET)
+ self.assertEqual(p.get_class(NAME2, None, SETTINGS), RET)
+ expected = [mock.call(NAME, None, SETTINGS), mock.call(NAME, None, SETTINGS),
+ mock.call(NAME2, None, SETTINGS), mock.call(NAME2, None, SETTINGS)]
self.assertListEqual(self._storage.get_class.call_args_list, expected)
def test_classes_caching(self):
p = MemcacheProxy(self._storage, cache_classes=True)
- NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'
+ NAME = 'foo'; NAME2 = 'bar'; RET = 'baz'; SETTINGS = Settings()
self._storage.get_class.return_value = RET
- self.assertEqual(p.get_class(NAME), RET)
- self.assertEqual(p.get_class(NAME), RET)
- self.assertEqual(p.get_class(NAME2), RET)
- self.assertEqual(p.get_class(NAME2), RET)
- expected = [mock.call(NAME), mock.call(NAME2)] # called once each
+ self.assertEqual(p.get_class(NAME, None, SETTINGS), RET)
+ self.assertEqual(p.get_class(NAME, None, SETTINGS), RET)
+ self.assertEqual(p.get_class(NAME2, None, SETTINGS), RET)
+ self.assertEqual(p.get_class(NAME2, None, SETTINGS), RET)
+ expected = [mock.call(NAME, None, SETTINGS), mock.call(NAME2, None, SETTINGS)] # called once each
self.assertListEqual(self._storage.get_class.call_args_list, expected)
def test_nodelist_no_caching(self):
diff --git a/reclass/storage/tests/test_yamldata.py b/reclass/storage/tests/test_yamldata.py
new file mode 100644
index 0000000..d8129ce
--- /dev/null
+++ b/reclass/storage/tests/test_yamldata.py
@@ -0,0 +1,37 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass (http://github.com/madduck/reclass)
+#
+
+from reclass.storage.yamldata import YamlData
+
+import unittest
+
+class TestYamlData(unittest.TestCase):
+
+ def setUp(self):
+ lines = [ 'classes:',
+ ' - testdir.test1',
+ ' - testdir.test2',
+ ' - test3',
+ '',
+ 'environment: base',
+ '',
+ 'parameters:',
+ ' _TEST_:',
+ ' alpha: 1',
+ ' beta: two' ]
+ self.data = '\n'.join(lines)
+ self.yamldict = { 'classes': [ 'testdir.test1', 'testdir.test2', 'test3' ],
+ 'environment': 'base',
+ 'parameters': { '_TEST_': { 'alpha': 1, 'beta': 'two' } }
+ }
+
+ def test_yaml_from_string(self):
+ res = YamlData.from_string(self.data, 'testpath')
+ self.assertEqual(res.uri, 'testpath')
+ self.assertEqual(res.get_data(), self.yamldict)
+
+if __name__ == '__main__':
+ unittest.main()
diff --git a/reclass/storage/yaml_fs/__init__.py b/reclass/storage/yaml_fs/__init__.py
index 5a13050..b92cbfe 100644
--- a/reclass/storage/yaml_fs/__init__.py
+++ b/reclass/storage/yaml_fs/__init__.py
@@ -8,9 +8,12 @@
#
import os, sys
import fnmatch
+import yaml
+from reclass.output.yaml_outputter import ExplicitDumper
from reclass.storage import NodeStorageBase
-from yamlfile import YamlFile
-from directory import Directory
+from reclass.storage.common import NameMangler
+from reclass.storage.yamldata import YamlData
+from .directory import Directory
from reclass.datatypes import Entity
import reclass.errors
@@ -21,34 +24,42 @@
#print >>sys.stderr, msg
pass
+def path_mangler(inventory_base_uri, nodes_uri, classes_uri):
+
+ if inventory_base_uri is None:
+ # if inventory_base is not given, default to current directory
+ inventory_base_uri = os.getcwd()
+
+ nodes_uri = nodes_uri or 'nodes'
+ classes_uri = classes_uri or 'classes'
+
+ def _path_mangler_inner(path):
+ ret = os.path.join(inventory_base_uri, path)
+ ret = os.path.expanduser(ret)
+ return os.path.abspath(ret)
+
+ n, c = map(_path_mangler_inner, (nodes_uri, classes_uri))
+ if n == c:
+ raise reclass.errors.DuplicateUriError(n, c)
+ common = os.path.commonprefix((n, c))
+ if common == n or common == c:
+ raise reclass.errors.UriOverlapError(n, c)
+
+ return n, c
+
+
class ExternalNodeStorage(NodeStorageBase):
- def __init__(self, nodes_uri, classes_uri, default_environment=None):
+ def __init__(self, nodes_uri, classes_uri):
super(ExternalNodeStorage, self).__init__(STORAGE_NAME)
- def name_mangler(relpath, name):
- # nodes are identified just by their basename, so
- # no mangling required
- return relpath, name
- self._nodes_uri = nodes_uri
- self._nodes = self._enumerate_inventory(nodes_uri, name_mangler)
+ if nodes_uri is not None:
+ self._nodes_uri = nodes_uri
+ self._nodes = self._enumerate_inventory(nodes_uri, NameMangler.nodes)
- def name_mangler(relpath, name):
- if relpath == '.':
- # './' is converted to None
- return None, name
- parts = relpath.split(os.path.sep)
- if name != 'init':
- # "init" is the directory index, so only append the basename
- # to the path parts for all other filenames. This has the
- # effect that data in file "foo/init.yml" will be registered
- # as data for class "foo", not "foo.init"
- parts.append(name)
- return relpath, '.'.join(parts)
- self._classes_uri = classes_uri
- self._classes = self._enumerate_inventory(classes_uri, name_mangler)
-
- self._default_environment = default_environment
+ if classes_uri is not None:
+ self._classes_uri = classes_uri
+ self._classes = self._enumerate_inventory(classes_uri, NameMangler.classes)
nodes_uri = property(lambda self: self._nodes_uri)
classes_uri = property(lambda self: self._classes_uri)
@@ -76,7 +87,7 @@
d.walk(register_fn)
return ret
- def get_node(self, name):
+ def get_node(self, name, settings):
vvv('GET NODE {0}'.format(name))
try:
relpath = self._nodes[name]
@@ -84,16 +95,16 @@
name = os.path.splitext(relpath)[0]
except KeyError, e:
raise reclass.errors.NodeNotFound(self.name, name, self.nodes_uri)
- entity = YamlFile(path).get_entity(name, self._default_environment)
+ entity = YamlData.from_file(path).get_entity(name, settings)
return entity
- def get_class(self, name, nodename=None):
+ def get_class(self, name, environment, settings):
vvv('GET CLASS {0}'.format(name))
try:
path = os.path.join(self.classes_uri, self._classes[name])
except KeyError, e:
raise reclass.errors.ClassNotFound(self.name, name, self.classes_uri)
- entity = YamlFile(path).get_entity(name)
+ entity = YamlData.from_file(path).get_entity(name, settings)
return entity
def enumerate_nodes(self):
diff --git a/reclass/storage/yaml_fs/yamlfile.py b/reclass/storage/yaml_fs/yamlfile.py
deleted file mode 100644
index 717a911..0000000
--- a/reclass/storage/yaml_fs/yamlfile.py
+++ /dev/null
@@ -1,61 +0,0 @@
-#
-# -*- coding: utf-8 -*-
-#
-# This file is part of reclass (http://github.com/madduck/reclass)
-#
-# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
-# Released under the terms of the Artistic Licence 2.0
-#
-from reclass import datatypes
-import yaml
-import os
-from reclass.errors import NotFoundError
-
-class YamlFile(object):
-
- def __init__(self, path):
- ''' Initialise a yamlfile object '''
- if not os.path.isfile(path):
- raise NotFoundError('No such file: %s' % path)
- if not os.access(path, os.R_OK):
- raise NotFoundError('Cannot open: %s' % path)
- self._path = path
- self._data = dict()
- self._read()
- path = property(lambda self: self._path)
-
- def _read(self):
- fp = file(self._path)
- data = yaml.safe_load(fp)
- if data is not None:
- self._data = data
- fp.close()
-
- def get_entity(self, name=None, default_environment=None):
- classes = self._data.get('classes')
- if classes is None:
- classes = []
- classes = datatypes.Classes(classes)
-
- applications = self._data.get('applications')
- if applications is None:
- applications = []
- applications = datatypes.Applications(applications)
-
- parameters = self._data.get('parameters')
- if parameters is None:
- parameters = {}
- parameters = datatypes.Parameters(parameters)
-
- env = self._data.get('environment', default_environment)
-
- if name is None:
- name = self._path
-
- return datatypes.Entity(classes, applications, parameters,
- name=name, environment=env,
- uri='yaml_fs://{0}'.format(self._path))
-
- def __repr__(self):
- return '<{0} {1}, {2}>'.format(self.__class__.__name__, self._path,
- self._data.keys())
diff --git a/reclass/storage/yaml_git/__init__.py b/reclass/storage/yaml_git/__init__.py
new file mode 100644
index 0000000..f4cb287
--- /dev/null
+++ b/reclass/storage/yaml_git/__init__.py
@@ -0,0 +1,267 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+
+import collections
+import distutils.version
+import fnmatch
+import os
+
+# Squelch warning on centos7 due to upgrading cffi
+# see https://github.com/saltstack/salt/pull/39871
+import warnings
+with warnings.catch_warnings():
+ warnings.simplefilter('ignore')
+ import pygit2
+
+import reclass.errors
+from reclass.storage import NodeStorageBase
+from reclass.storage.common import NameMangler
+from reclass.storage.yamldata import YamlData
+
+FILE_EXTENSION = '.yml'
+STORAGE_NAME = 'yaml_git'
+
+def path_mangler(inventory_base_uri, nodes_uri, classes_uri):
+ if nodes_uri == classes_uri:
+ raise reclass.errors.DuplicateUriError(nodes_uri, classes_uri)
+ return nodes_uri, classes_uri
+
+
+GitMD = collections.namedtuple('GitMD', ['name', 'path', 'id'], verbose=False, rename=False)
+
+
+class GitURI(object):
+
+ def __init__(self, dictionary):
+ self.repo = None
+ self.branch = None
+ self.root = None
+ self.cache_dir = None
+ self.pubkey = None
+ self.privkey = None
+ self.password = None
+ self.update(dictionary)
+
+ def update(self, dictionary):
+ if 'repo' in dictionary: self.repo = dictionary['repo']
+ if 'branch' in dictionary: self.branch = dictionary['branch']
+ if 'cache_dir' in dictionary: self.cache_dir = dictionary['cache_dir']
+ if 'pubkey' in dictionary: self.pubkey = dictionary['pubkey']
+ if 'privkey' in dictionary: self.privkey = dictionary['privkey']
+ if 'password' in dictionary: self.password = dictionary['password']
+ if 'root' in dictionary:
+ if dictionary['root'] is None:
+ self.root = None
+ else:
+ self.root = dictionary['root'].replace('/', '.')
+
+ def __repr__(self):
+ return '<{0}: {1} {2} {3}>'.format(self.__class__.__name__, self.repo, self.branch, self.root)
+
+
+class GitRepo(object):
+
+ def __init__(self, uri):
+ self.transport, _, self.url = uri.repo.partition('://')
+ self.name = self.url.replace('/', '_')
+ self.credentials = None
+ self.remotecallbacks = None
+ if uri.cache_dir is None:
+ self.cache_dir = '{0}/{1}/{2}'.format(os.path.expanduser("~"), '.reclass/cache/git', self.name)
+ else:
+ self.cache_dir = '{0}/{1}'.format(uri.cache_dir, self.name)
+
+ self._init_repo(uri)
+ self._fetch()
+ self.branches = self.repo.listall_branches()
+ self.files = self.files_in_repo()
+
+ def _init_repo(self, uri):
+ if os.path.exists(self.cache_dir):
+ self.repo = pygit2.Repository(self.cache_dir)
+ else:
+ os.makedirs(self.cache_dir)
+ self.repo = pygit2.init_repository(self.cache_dir, bare=True)
+
+ if not self.repo.remotes:
+ self.repo.create_remote('origin', self.url)
+
+ if 'ssh' in self.transport:
+ if '@' in self.url:
+ user, _, _ = self.url.partition('@')
+ else:
+ user = 'gitlab'
+
+ if uri.pubkey is not None:
+ creds = pygit2.Keypair(user, uri.pubkey, uri.privkey, uri.password)
+ else:
+ creds = pygit2.KeypairFromAgent(user)
+
+ pygit2_version = pygit2.__version__
+ if distutils.version.LooseVersion(pygit2_version) >= distutils.version.LooseVersion('0.23.2'):
+ self.remotecallbacks = pygit2.RemoteCallbacks(credentials=creds)
+ self.credentials = None
+ else:
+ self.remotecallbacks = None
+ self.credentials = creds
+
+ def _fetch(self):
+ origin = self.repo.remotes[0]
+ fetch_kwargs = {}
+ if self.remotecallbacks is not None:
+ fetch_kwargs['callbacks'] = self.remotecallbacks
+ if self.credentials is not None:
+ origin.credentials = self.credentials
+ fetch_results = origin.fetch(**fetch_kwargs)
+
+ remote_branches = self.repo.listall_branches(pygit2.GIT_BRANCH_REMOTE)
+ local_branches = self.repo.listall_branches()
+ for remote_branch_name in remote_branches:
+ _, _, local_branch_name = remote_branch_name.partition('/')
+ remote_branch = self.repo.lookup_branch(remote_branch_name, pygit2.GIT_BRANCH_REMOTE)
+ if local_branch_name not in local_branches:
+ local_branch = self.repo.create_branch(local_branch_name, self.repo[remote_branch.target.hex])
+ local_branch.upstream = remote_branch
+ else:
+ local_branch = self.repo.lookup_branch(local_branch_name)
+ if local_branch.target != remote_branch.target:
+ local_branch.set_target(remote_branch.target)
+
+ local_branches = self.repo.listall_branches()
+ for local_branch_name in local_branches:
+ remote_branch_name = '{0}/{1}'.format(origin.name, local_branch_name)
+ if remote_branch_name not in remote_branches:
+ local_branch = self.repo.lookup_branch(local_branch_name)
+ local_branch.delete()
+
+ def get(self, id):
+ return self.repo.get(id)
+
+ def files_in_tree(self, tree, path):
+ files = []
+ for entry in tree:
+ if entry.filemode == pygit2.GIT_FILEMODE_TREE:
+ subtree = self.repo.get(entry.id)
+ if path == '':
+ subpath = entry.name
+ else:
+ subpath = '/'.join([path, entry.name])
+ files.extend(self.files_in_tree(subtree, subpath))
+ else:
+ if path == '':
+ relpath = entry.name
+ else:
+ relpath = '/'.join([path, entry.name])
+ files.append(GitMD(entry.name, relpath, entry.id))
+ return files
+
+ def files_in_branch(self, branch):
+ tree = self.repo.revparse_single(branch).tree
+ return self.files_in_tree(tree, '')
+
+ def files_in_repo(self):
+ ret = {}
+ for bname in self.branches:
+ branch = {}
+ files = self.files_in_branch(bname)
+ for file in files:
+ if fnmatch.fnmatch(file.name, '*{0}'.format(FILE_EXTENSION)):
+ name = os.path.splitext(file.name)[0]
+ relpath = os.path.dirname(file.path)
+ relpath, name = NameMangler.classes(relpath, name)
+ if name in branch:
+ raise reclass.errors.DuplicateNodeNameError(self.name + ' - ' + bname, name, branch[name].path, file.path)
+ else:
+ branch[name] = file
+ ret[bname] = branch
+ return ret
+
+ def nodes(self, branch, subdir):
+ ret = {}
+ for name, file in self.files[branch].iteritems():
+ if subdir is None or name.startswith(subdir):
+ node_name = os.path.splitext(file.name)[0]
+ if node_name in ret:
+ raise reclass.errors.DuplicateNodeNameError(self.name, node_name, ret[node_name].path, file.path)
+ else:
+ ret[node_name] = file
+ return ret
+
+class ExternalNodeStorage(NodeStorageBase):
+
+ def __init__(self, nodes_uri, classes_uri):
+ super(ExternalNodeStorage, self).__init__(STORAGE_NAME)
+ self._repos = dict()
+
+ if nodes_uri is not None:
+ self._nodes_uri = GitURI({ 'branch': 'master' })
+ self._nodes_uri.update(nodes_uri)
+ self._load_repo(self._nodes_uri)
+ self._nodes = self._repos[self._nodes_uri.repo].nodes(self._nodes_uri.branch, self._nodes_uri.root)
+
+ if classes_uri is not None:
+ self._classes_default_uri = GitURI({ 'branch': '__env__' })
+ self._classes_default_uri.update(classes_uri)
+ self._load_repo(self._classes_default_uri)
+
+ self._classes_uri = []
+ if 'env_overrides' in classes_uri:
+ for override in classes_uri['env_overrides']:
+ for env, options in override.iteritems():
+ uri = GitURI(self._classes_default_uri)
+ uri.update({ 'branch': env })
+ uri.update(options)
+ self._classes_uri.append((env, uri))
+ self._load_repo(uri)
+
+ self._classes_uri.append(('*', self._classes_default_uri))
+
+ nodes_uri = property(lambda self: self._nodes_uri)
+ classes_uri = property(lambda self: self._classes_uri)
+
+ def get_node(self, name, settings):
+ file = self._nodes[name]
+ blob = self._repos[self._nodes_uri.repo].get(file.id)
+ entity = YamlData.from_string(blob.data, 'git_fs://{0} {1} {2}'.format(self._nodes_uri.repo, self._nodes_uri.branch, file.path)).get_entity(name, settings)
+ return entity
+
+ def get_class(self, name, environment, settings):
+ uri = self._env_to_uri(environment)
+ if uri.root is not None:
+ name = '{0}.{1}'.format(uri.root, name)
+ if uri.repo not in self._repos:
+ raise reclass.errors.NotFoundError("Repo " + uri.repo + " unknown or missing")
+ if uri.branch not in self._repos[uri.repo].files:
+ raise reclass.errors.NotFoundError("Branch " + uri.branch + " missing from " + uri.repo)
+ if name not in self._repos[uri.repo].files[uri.branch]:
+ raise reclass.errors.NotFoundError("File " + name + " missing from " + uri.repo + " branch " + uri.branch)
+ file = self._repos[uri.repo].files[uri.branch][name]
+ blob = self._repos[uri.repo].get(file.id)
+ entity = YamlData.from_string(blob.data, 'git_fs://{0} {1} {2}'.format(uri.repo, uri.branch, file.path)).get_entity(name, settings)
+ return entity
+
+ def enumerate_nodes(self):
+ return self._nodes.keys()
+
+ def _load_repo(self, uri):
+ if uri.repo not in self._repos:
+ self._repos[uri.repo] = GitRepo(uri)
+
+ def _env_to_uri(self, environment):
+ ret = None
+ if environment is None:
+ ret = self._classes_default_uri
+ else:
+ for env, uri in self._classes_uri:
+ if env == environment:
+ ret = uri
+ break
+ if ret is None:
+ ret = self._classes_default_uri
+ if ret.branch == '__env__':
+ ret.branch = environment
+ if ret.branch is None:
+ ret.branch = 'master'
+ return ret
diff --git a/reclass/storage/yamldata.py b/reclass/storage/yamldata.py
new file mode 100644
index 0000000..0dda2b7
--- /dev/null
+++ b/reclass/storage/yamldata.py
@@ -0,0 +1,85 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass (http://github.com/madduck/reclass)
+#
+# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
+# Released under the terms of the Artistic Licence 2.0
+#
+from reclass import datatypes
+import yaml
+import os
+from reclass.errors import NotFoundError
+
+class YamlData(object):
+
+ @classmethod
+ def from_file(cls, path):
+ ''' Initialise yaml data from a local file '''
+ abs_path = os.path.abspath(path)
+ if not os.path.isfile(abs_path):
+ raise NotFoundError('No such file: %s' % abs_path)
+ if not os.access(abs_path, os.R_OK):
+ raise NotFoundError('Cannot open: %s' % abs_path)
+ y = cls('yaml_fs://{0}'.format(abs_path))
+ fp = open(abs_path)
+ data = yaml.safe_load(fp)
+ if data is not None:
+ y._data = data
+ fp.close()
+ return y
+
+ @classmethod
+ def from_string(cls, string, uri):
+ ''' Initialise yaml data from a string '''
+ y = cls(uri)
+ data = yaml.safe_load(string)
+ if data is not None:
+ y._data = data
+ return y
+
+ def __init__(self, uri):
+ self._uri = uri
+ self._data = dict()
+
+ uri = property(lambda self: self._uri)
+
+ def get_data(self):
+ return self._data
+
+ def get_entity(self, name, settings):
+ #if name is None:
+ # name = self._uri
+
+ classes = self._data.get('classes')
+ if classes is None:
+ classes = []
+ classes = datatypes.Classes(classes)
+
+ applications = self._data.get('applications')
+ if applications is None:
+ applications = []
+ applications = datatypes.Applications(applications)
+
+ parameters = self._data.get('parameters')
+ if parameters is None:
+ parameters = {}
+ parameters = datatypes.Parameters(parameters, settings, self._uri)
+
+ exports = self._data.get('exports')
+ if exports is None:
+ exports = {}
+ exports = datatypes.Exports(exports, settings, self._uri)
+
+ env = self._data.get('environment', None)
+
+ return datatypes.Entity(settings, classes=classes, applications=applications, parameters=parameters,
+ exports=exports, name=name, environment=env, uri=self.uri)
+
+ def __str__(self):
+ return '<{0} {1}, {2}>'.format(self.__class__.__name__, self._uri,
+ self._data)
+
+ def __repr__(self):
+ return '<{0} {1}, {2}>'.format(self.__class__.__name__, self._uri,
+ self._data.keys())
diff --git a/reclass/tests/__init__.py b/reclass/tests/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/reclass/tests/__init__.py
diff --git a/reclass/tests/data/01/classes/standard.yml b/reclass/tests/data/01/classes/standard.yml
new file mode 100644
index 0000000..13bce54
--- /dev/null
+++ b/reclass/tests/data/01/classes/standard.yml
@@ -0,0 +1,4 @@
+parameters:
+ int: 1
+ string: '1'
+ bool: True
diff --git a/reclass/tests/data/01/nodes/class_notfound.yml b/reclass/tests/data/01/nodes/class_notfound.yml
new file mode 100644
index 0000000..616a49b
--- /dev/null
+++ b/reclass/tests/data/01/nodes/class_notfound.yml
@@ -0,0 +1,5 @@
+classes:
+ - missing
+
+parameters:
+ node_test: class not found
diff --git a/reclass/tests/data/01/nodes/data_types.yml b/reclass/tests/data/01/nodes/data_types.yml
new file mode 100644
index 0000000..28ff151
--- /dev/null
+++ b/reclass/tests/data/01/nodes/data_types.yml
@@ -0,0 +1,2 @@
+classes:
+ - standard
diff --git a/reclass/tests/test_core.py b/reclass/tests/test_core.py
new file mode 100644
index 0000000..9225756
--- /dev/null
+++ b/reclass/tests/test_core.py
@@ -0,0 +1,60 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass (http://github.com/madduck/reclass)
+#
+
+import os
+
+from reclass import get_storage, get_path_mangler
+from reclass.core import Core
+from reclass.settings import Settings
+from reclass.errors import ClassNotFound
+
+import unittest
+try:
+ import unittest.mock as mock
+except ImportError:
+ import mock
+
+class TestCore(unittest.TestCase):
+
+ def _core(self, dataset, opts={}):
+ inventory_uri = os.path.dirname(os.path.abspath(__file__)) + '/data/' + dataset
+ path_mangler = get_path_mangler('yaml_fs')
+ nodes_uri, classes_uri = path_mangler(inventory_uri, 'nodes', 'classes')
+ storage = get_storage('yaml_fs', nodes_uri, classes_uri)
+ settings = Settings(opts)
+ return Core(storage, None, settings)
+
+ def test_type_conversion(self):
+ reclass = self._core('01')
+ node = reclass.nodeinfo('data_types')
+ params = { 'int': 1, 'bool': True, 'string': '1', '_reclass_': { 'environment': 'base', 'name': {'full': 'data_types', 'short': 'data_types' } } }
+ self.assertEqual(node['parameters'], params)
+
+ def test_raise_class_notfound(self):
+ reclass = self._core('01')
+ with self.assertRaises(ClassNotFound):
+ node = reclass.nodeinfo('class_notfound')
+
+ def test_ignore_class_notfound(self):
+ reclass = self._core('01', opts={ 'ignore_class_notfound': True, 'ignore_class_notfound_warning': False })
+ node = reclass.nodeinfo('class_notfound')
+ params = { 'node_test': 'class not found', '_reclass_': { 'environment': 'base', 'name': {'full': 'class_notfound', 'short': 'class_notfound' } } }
+ self.assertEqual(node['parameters'], params)
+
+ def test_raise_class_notfound_with_regexp(self):
+ reclass = self._core('01', opts={ 'ignore_class_notfound': True, 'ignore_class_notfound_warning': False, 'ignore_class_notfound_regexp': 'notmatched.*' })
+ with self.assertRaises(ClassNotFound):
+ node = reclass.nodeinfo('class_notfound')
+
+ def test_ignore_class_notfound_with_regexp(self):
+ reclass = self._core('01', opts={ 'ignore_class_notfound': True, 'ignore_class_notfound_warning': False, 'ignore_class_notfound_regexp': 'miss.*' })
+ node = reclass.nodeinfo('class_notfound')
+ params = { 'node_test': 'class not found', '_reclass_': { 'environment': 'base', 'name': {'full': 'class_notfound', 'short': 'class_notfound' } } }
+ self.assertEqual(node['parameters'], params)
+
+
+if __name__ == '__main__':
+ unittest.main()
diff --git a/reclass/utils/dictpath.py b/reclass/utils/dictpath.py
index db95e66..aec6722 100644
--- a/reclass/utils/dictpath.py
+++ b/reclass/utils/dictpath.py
@@ -59,12 +59,12 @@
if contents is None:
self._parts = []
else:
- if isinstance(contents, types.StringTypes):
+ if isinstance(contents, list):
+ self._parts = contents
+ elif isinstance(contents, types.StringTypes):
self._parts = self._split_string(contents)
elif isinstance(contents, tuple):
self._parts = list(contents)
- elif isinstance(contents, list):
- self._parts = contents
else:
raise TypeError('DictPath() takes string or list, '\
'not %s' % type(contents))
@@ -112,14 +112,59 @@
def _escape_string(self, string):
return string.replace(self._delim, '\\' + self._delim)
+ def has_ancestors(self):
+ return len(self._parts) > 1
+
+ def key_parts(self):
+ if self.has_ancestors():
+ return self._parts[:-1]
+ else:
+ return []
+
def new_subpath(self, key):
- try:
- return DictPath(self._delim, self._parts + [self._escape_string(key)])
- except AttributeError as e:
- return DictPath(self._delim, self._parts + [key])
+ return DictPath(self._delim, self._parts + [key])
def get_value(self, base):
return self._get_innermost_container(base)[self._get_key()]
def set_value(self, base, value):
self._get_innermost_container(base)[self._get_key()] = value
+
+ def drop_first(self):
+ del self._parts[0]
+ return self
+
+ def is_empty(self):
+ return len(self._parts) == 0
+
+ def delete(self, base):
+ del self._get_innermost_container(base)[self._get_key()]
+
+ def add_subpath(self, key):
+ self._parts.append(key)
+
+ def is_ancestor_of(self, other):
+ if len(other._parts) <= len(self._parts):
+ return False
+ for i in range(len(self._parts)):
+ if other._parts[i] != self._parts[i]:
+ return False
+ return True
+
+ def exists_in(self, container):
+ item = container
+ for i in self._parts:
+ if isinstance(item, (dict, list)):
+ if i in item:
+ if isinstance(item, dict):
+ item = item[i]
+ elif isinstance(item, list):
+ item = item[int(i)]
+ else:
+ return False
+ else:
+ if item == self._parts[-1]:
+ return True
+ else:
+ return False
+ return True
diff --git a/reclass/utils/refvalue.py b/reclass/utils/refvalue.py
deleted file mode 100644
index b8e730b..0000000
--- a/reclass/utils/refvalue.py
+++ /dev/null
@@ -1,115 +0,0 @@
-#
-# -*- coding: utf-8 -*-
-#
-# This file is part of reclass (http://github.com/madduck/reclass)
-#
-# Copyright © 2007–14 martin f. krafft <madduck@madduck.net>
-# Released under the terms of the Artistic Licence 2.0
-#
-
-import re
-
-from reclass.utils.dictpath import DictPath
-from reclass.defaults import PARAMETER_INTERPOLATION_SENTINELS, \
- PARAMETER_INTERPOLATION_DELIMITER
-from reclass.errors import IncompleteInterpolationError, \
- UndefinedVariableError
-
-_SENTINELS = [re.escape(s) for s in PARAMETER_INTERPOLATION_SENTINELS]
-_RE = '{0}\s*(.+?)\s*{1}'.format(*_SENTINELS)
-
-class RefValue(object):
- '''
- Isolates references in string values
-
- RefValue can be used to isolate and eventually expand references to other
- parameters in strings. Those references can then be iterated and rendered
- in the context of a dictionary to resolve those references.
-
- RefValue always gets constructed from a string, because templating
- — essentially this is what's going on — is necessarily always about
- strings. Therefore, generally, the rendered value of a RefValue instance
- will also be a string.
-
- Nevertheless, as this might not be desirable, RefValue will return the
- referenced variable without casting it to a string, if the templated
- string contains nothing but the reference itself.
-
- For instance:
-
- mydict = {'favcolour': 'yellow', 'answer': 42, 'list': [1,2,3]}
- RefValue('My favourite colour is ${favolour}').render(mydict)
- → 'My favourite colour is yellow' # a string
-
- RefValue('The answer is ${answer}').render(mydict)
- → 'The answer is 42' # a string
-
- RefValue('${answer}').render(mydict)
- → 42 # an int
-
- RefValue('${list}').render(mydict)
- → [1,2,3] # an list
-
- The markers used to identify references are set in reclass.defaults, as is
- the default delimiter.
- '''
-
- INTERPOLATION_RE = re.compile(_RE)
-
- def __init__(self, string, delim=PARAMETER_INTERPOLATION_DELIMITER):
- self._strings = []
- self._refs = []
- self._delim = delim
- self._parse(string)
-
- def _parse(self, string):
- parts = RefValue.INTERPOLATION_RE.split(string)
- self._refs = parts[1:][::2]
- self._strings = parts[0:][::2]
- self._check_strings(string)
-
- def _check_strings(self, orig):
- for s in self._strings:
- pos = s.find(PARAMETER_INTERPOLATION_SENTINELS[0])
- if pos >= 0:
- raise IncompleteInterpolationError(orig,
- PARAMETER_INTERPOLATION_SENTINELS[1])
-
- def _resolve(self, ref, context):
- path = DictPath(self._delim, ref)
- try:
- return path.get_value(context)
- except KeyError as e:
- raise UndefinedVariableError(ref)
-
- def has_references(self):
- return len(self._refs) > 0
-
- def get_references(self):
- return self._refs
-
- def _assemble(self, resolver):
- if not self.has_references():
- return self._strings[0]
-
- if self._strings == ['', '']:
- # preserve the type of the referenced variable
- return resolver(self._refs[0])
-
- # reassemble the string by taking a string and str(ref) pairwise
- ret = ''
- for i in range(0, len(self._refs)):
- ret += self._strings[i] + str(resolver(self._refs[i]))
- if len(self._strings) > len(self._refs):
- # and finally append a trailing string, if any
- ret += self._strings[-1]
- return ret
-
- def render(self, context):
- resolver = lambda s: self._resolve(s, context)
- return self._assemble(resolver)
-
- def __repr__(self):
- do_not_resolve = lambda s: s.join(PARAMETER_INTERPOLATION_SENTINELS)
- return 'RefValue(%r, %r)' % (self._assemble(do_not_resolve),
- self._delim)
diff --git a/reclass/values/__init__.py b/reclass/values/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/reclass/values/__init__.py
diff --git a/reclass/values/compitem.py b/reclass/values/compitem.py
new file mode 100644
index 0000000..2134ea8
--- /dev/null
+++ b/reclass/values/compitem.py
@@ -0,0 +1,54 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from reclass.settings import Settings
+from item import Item
+
+class CompItem(Item):
+
+ def __init__(self, items, settings):
+ self.type = Item.COMPOSITE
+ self._items = items
+ self._settings = settings
+ self._refs = []
+ self._allRefs = False
+ self.assembleRefs()
+
+ def assembleRefs(self, context={}):
+ self._refs = []
+ self._allRefs = True
+ for item in self._items:
+ if item.has_references():
+ item.assembleRefs(context)
+ self._refs.extend(item.get_references())
+ if item.allRefs() is False:
+ self._allRefs = False
+
+ def contents(self):
+ return self._items
+
+ def allRefs(self):
+ return self._allRefs
+
+ def has_references(self):
+ return len(self._refs) > 0
+
+ def get_references(self):
+ return self._refs
+
+ def render(self, context, inventory):
+ # Preserve type if only one item
+ if len(self._items) == 1:
+ return self._items[0].render(context, inventory)
+ # Multiple items
+ strings = [ str(i.render(context, inventory)) for i in self._items ]
+ return "".join(strings)
+
+ def __repr__(self):
+ return 'CompItem(%r)' % self._items
+
+ def __str__(self):
+ return ''.join([ str(i) for i in self._items ])
diff --git a/reclass/values/dictitem.py b/reclass/values/dictitem.py
new file mode 100644
index 0000000..d778fe2
--- /dev/null
+++ b/reclass/values/dictitem.py
@@ -0,0 +1,35 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from reclass.settings import Settings
+from item import Item
+
+class DictItem(Item):
+
+ def __init__(self, item, settings):
+ self.type = Item.DICTIONARY
+ self._dict = item
+ self._settings = settings
+
+ def contents(self):
+ return self._dict
+
+ def is_container(self):
+ return True
+
+ def merge_over(self, item):
+ if item.type == Item.SCALAR:
+ if item.contents() is None or self._settings.allow_dict_over_scalar:
+ return self
+ else:
+ raise TypeError('allow dict over scalar = False: cannot merge %s onto %s' % (repr(self), repr(item)))
+ raise TypeError('Cannot merge %s over %s' % (repr(self), repr(item)))
+
+ def render(self, context, inventory):
+ return self._dict
+
+ def __repr__(self):
+ return 'DictItem(%r)' % self._dict
diff --git a/reclass/values/invitem.py b/reclass/values/invitem.py
new file mode 100644
index 0000000..84ea39d
--- /dev/null
+++ b/reclass/values/invitem.py
@@ -0,0 +1,339 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+import copy
+import pyparsing as pp
+
+from item import Item
+from reclass.settings import Settings
+from reclass.utils.dictpath import DictPath
+from reclass.errors import ExpressionError, ParseError, ResolveError
+
+_OBJ = 'OBJ'
+_TEST = 'TEST'
+_LIST_TEST = 'LIST_TEST'
+_LOGICAL = 'LOGICAL'
+_OPTION = 'OPTION'
+
+_VALUE = 'VALUE'
+_IF = 'IF'
+_AND = 'AND'
+_OR = 'OR'
+
+_EQUAL = '=='
+_NOT_EQUAL = '!='
+
+_IGNORE_ERRORS = '+IgnoreErrors'
+_ALL_ENVS = '+AllEnvs'
+
+class Element(object):
+
+ def __init__(self, expression, delimiter):
+ self._delimiter = delimiter
+ self._export_path = None
+ self._parameter_path = None
+ self._parameter_value = None
+ self._export_path, self._parameter_path, self._parameter_value = self._get_vars(expression[0][1], self._export_path, self._parameter_path, self._parameter_value)
+ self._export_path, self._parameter_path, self._parameter_value = self._get_vars(expression[2][1], self._export_path, self._parameter_path, self._parameter_value)
+
+ try:
+ self._export_path.drop_first()
+ except AttributeError:
+ raise ExpressionError('No export')
+
+ self._inv_refs = [ self._export_path ]
+ self._test = expression[1][1]
+
+ if self._parameter_path is not None:
+ self._parameter_path.drop_first()
+ self._refs = [ str(self._parameter_path) ]
+ else:
+ self._refs = []
+
+ def refs(self):
+ return self._refs
+
+ def inv_refs(self):
+ return self._inv_refs
+
+ def value(self, context, items):
+ if self._parameter_path is not None:
+ self._parameter_value = self._resolve(self._parameter_path, context)
+
+ if self._parameter_value is None or self._test is None:
+ raise ExpressionError('Failed to render %s' % str(self), tbFlag=False)
+
+ if self._export_path.exists_in(items):
+ result = False
+ export_value = self._resolve(self._export_path, items)
+ if self._test == _EQUAL:
+ if export_value == self._parameter_value:
+ result = True
+ elif self._test == _NOT_EQUAL:
+ if export_value != self._parameter_value:
+ result = True
+ else:
+ raise ExpressionError('Unknown test {0}'.format(self._test), tbFlag=False)
+ return result
+ else:
+ return False
+
+ def _resolve(self, path, dictionary):
+ try:
+ return path.get_value(dictionary)
+ except KeyError as e:
+ raise ResolveError(str(path))
+
+ def _get_vars(self, var, export, parameter, value):
+ if isinstance(var, str):
+ path = DictPath(self._delimiter, var)
+ if path.path[0].lower() == 'exports':
+ export = path
+ elif path.path[0].lower() == 'self':
+ parameter = path
+ elif path.path[0].lower() == 'true':
+ value = True
+ elif path.path[0].lower() == 'false':
+ value = False
+ else:
+ value = var
+ else:
+ value = var
+ return export, parameter, value
+
+
+class Question(object):
+
+ def __init__(self, expression, delimiter):
+ self._elements = []
+ self._operators = []
+ self._delimiter = delimiter
+ self._refs = []
+ self._inv_refs = []
+ i = 0
+ while i < len(expression):
+ e = Element(expression[i:], self._delimiter)
+ self._elements.append(e)
+ self._refs.extend(e.refs())
+ self._inv_refs.extend(e.inv_refs())
+ i += 3
+ if i < len(expression):
+ self._operators.append(expression[i][1])
+ i += 1
+
+ def refs(self):
+ return self._refs
+
+ def inv_refs(self):
+ return self._inv_refs
+
+ def value(self, context, items):
+ if len(self._elements) == 0:
+ return True
+ elif len(self._elements) == 1:
+ return self._elements[0].value(context, items)
+ else:
+ result = self._elements[0].value(context, items)
+ for i in range(0, len(self._elements)-1):
+ next_result = self._elements[i+1].value(context, items)
+ if self._operators[i] == _AND:
+ result = result and next_result
+ elif self._operators[i] == _OR:
+ result = result or next_result
+ else:
+ raise ExpressionError('Unknown operator {0} {1}'.format(self._operators[i], self._elements), tbFlag=False)
+ return result
+
+
+class InvItem(Item):
+
+ def _get_parser():
+
+ def _object(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_OBJ, token)
+
+ def _integer(string, location, tokens):
+ try:
+ token = int(tokens[0])
+ except ValueError:
+ token = tokens[0]
+ tokens[0] = (_OBJ, token)
+
+ def _number(string, location, tokens):
+ try:
+ token = float(tokens[0])
+ except ValueError:
+ token = tokens[0]
+ tokens[0] = (_OBJ, token)
+
+ def _option(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_OPTION, token)
+
+ def _test(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_TEST, token)
+
+ def _logical(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_LOGICAL, token)
+
+ def _if(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_IF, token)
+
+ def _expr_var(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_VALUE, token)
+
+ def _expr_test(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_TEST, token)
+
+ def _expr_list_test(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (_LIST_TEST, token)
+
+ white_space = pp.White().suppress()
+ end = pp.StringEnd()
+ ignore_errors = pp.CaselessLiteral(_IGNORE_ERRORS)
+ all_envs = pp.CaselessLiteral(_ALL_ENVS)
+ option = (ignore_errors | all_envs).setParseAction(_option)
+ options = pp.Group(pp.ZeroOrMore(option + white_space))
+ operator_test = (pp.Literal(_EQUAL) | pp.Literal(_NOT_EQUAL)).setParseAction(_test)
+ operator_logical = (pp.CaselessLiteral(_AND) | pp.CaselessLiteral(_OR)).setParseAction(_logical)
+ begin_if = pp.CaselessLiteral(_IF, ).setParseAction(_if)
+ obj = pp.Word(pp.printables).setParseAction(_object)
+ integer = pp.Word('0123456789-').setParseAction(_integer)
+ number = pp.Word('0123456789-.').setParseAction(_number)
+ item = integer | number | obj
+ single_test = white_space + item + white_space + operator_test + white_space + item
+ additional_test = white_space + operator_logical + single_test
+ expr_var = pp.Group(obj + pp.Optional(white_space) + end).setParseAction(_expr_var)
+ expr_test = pp.Group(obj + white_space + begin_if + single_test + pp.ZeroOrMore(additional_test) + end).setParseAction(_expr_test)
+ expr_list_test = pp.Group(begin_if + single_test + pp.ZeroOrMore(additional_test) + end).setParseAction(_expr_list_test)
+ expr = (expr_test | expr_var | expr_list_test)
+ line = options + expr + end
+ return line
+
+ _parser = _get_parser()
+
+ def __init__(self, item, settings):
+ self.type = Item.INV_QUERY
+ self._settings = settings
+ self._needs_all_envs = False
+ self._ignore_failed_render = self._settings.inventory_ignore_failed_render
+ self._expr_text = item.render(None, None)
+ self._parse_expression(self._expr_text)
+
+ def _parse_expression(self, expr):
+ try:
+ tokens = InvItem._parser.parseString(expr).asList()
+ except pp.ParseException as e:
+ raise ParseError(e.msg, e.line, e.col, e.lineno)
+
+ if len(tokens) == 1:
+ self._expr_type = tokens[0][0]
+ self._expr = list(tokens[0][1])
+ elif len(tokens) == 2:
+ for opt in tokens[0]:
+ if opt[1] == _IGNORE_ERRORS:
+ self._ignore_failed_render = True
+ elif opt[1] == _ALL_ENVS:
+ self._needs_all_envs = True
+ self._expr_type = tokens[1][0]
+ self._expr = list(tokens[1][1])
+ else:
+ raise ExpressionError('Failed to parse %s' % str(tokens), tbFlag=False)
+
+ if self._expr_type == _VALUE:
+ self._value_path = DictPath(self._settings.delimiter, self._expr[0][1]).drop_first()
+ self._question = Question([], self._settings.delimiter)
+ self._refs = []
+ self._inv_refs = [ self._value_path ]
+ elif self._expr_type == _TEST:
+ self._value_path = DictPath(self._settings.delimiter, self._expr[0][1]).drop_first()
+ self._question = Question(self._expr[2:], self._settings.delimiter)
+ self._refs = self._question.refs()
+ self._inv_refs = self._question.inv_refs()
+ self._inv_refs.append(self._value_path)
+ elif self._expr_type == _LIST_TEST:
+ self._value_path = None
+ self._question = Question(self._expr[1:], self._settings.delimiter)
+ self._refs = self._question.refs()
+ self._inv_refs = self._question.inv_refs()
+ else:
+ raise ExpressionError('Unknown expression type: %s' % self._expr_type, tbFlag=False)
+
+ def assembleRefs(self, context):
+ return
+
+ def contents(self):
+ return self._expr_text
+
+ def has_inv_query(self):
+ return True
+
+ def has_references(self):
+ return len(self._question.refs()) > 0
+
+ def get_references(self):
+ return self._question.refs()
+
+ def get_inv_references(self):
+ return self._inv_refs
+
+ def needs_all_envs(self):
+ return self._needs_all_envs
+
+ def ignore_failed_render(self):
+ return self._ignore_failed_render
+
+ def _resolve(self, path, dictionary):
+ try:
+ return path.get_value(dictionary)
+ except KeyError as e:
+ raise ResolveError(str(path))
+
+ def _value_expression(self, inventory):
+ results = {}
+ for node, items in inventory.iteritems():
+ if self._value_path.exists_in(items):
+ results[node] = copy.deepcopy(self._resolve(self._value_path, items))
+ return results
+
+ def _test_expression(self, context, inventory):
+ if self._value_path is None:
+ raise ExpressionError('Failed to render %s' % str(self), tbFlag=False)
+
+ results = {}
+ for node, items in inventory.iteritems():
+ if self._question.value(context, items) and self._value_path.exists_in(items):
+ results[node] = copy.deepcopy(self._resolve(self._value_path, items))
+ return results
+
+ def _list_test_expression(self, context, inventory):
+ results = []
+ for node, items in inventory.iteritems():
+ if self._question.value(context, items):
+ results.append(node)
+ return results
+
+ def render(self, context, inventory):
+ if self._expr_type == _VALUE:
+ return self._value_expression(inventory)
+ elif self._expr_type == _TEST:
+ return self._test_expression(context, inventory)
+ elif self._expr_type == _LIST_TEST:
+ return self._list_test_expression(context, inventory)
+ raise ExpressionError('Failed to render %s' % str(self), tbFlag=False)
+
+ def __str__(self):
+ return ' '.join(str(j) for i,j in self._expr)
+
+ def __repr__(self):
+ return 'InvItem(%r)' % self._expr
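The `_value_expression` walk above can be sketched standalone. This is a hypothetical simplification, not the reclass API: `path` is a plain tuple of keys here, whereas reclass walks a `DictPath` built with the configured delimiter.

```python
import copy

def value_expression(inventory, path):
    # Sketch of InvItem._value_expression: collect a parameter value from
    # every node in the inventory that defines it, deep-copying so later
    # merges cannot mutate the inventory.
    results = {}
    for node, params in inventory.items():
        cur = params
        try:
            for key in path:
                cur = cur[key]
        except (KeyError, TypeError):
            continue  # node does not define the value; skip it
        results[node] = copy.deepcopy(cur)
    return results
```

For example, querying `('a', 'b')` against `{'n1': {'a': {'b': 1}}, 'n2': {'a': {}}}` yields `{'n1': 1}`: only nodes where the full path exists contribute.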
diff --git a/reclass/values/item.py b/reclass/values/item.py
new file mode 100644
index 0000000..57fd0e3
--- /dev/null
+++ b/reclass/values/item.py
@@ -0,0 +1,43 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from reclass.utils.dictpath import DictPath
+
+class Item(object):
+
+ COMPOSITE = 1
+ DICTIONARY = 2
+ INV_QUERY = 3
+ LIST = 4
+ REFERENCE = 5
+ SCALAR = 6
+
+ def allRefs(self):
+ return True
+
+ def has_references(self):
+ return False
+
+ def has_inv_query(self):
+ return False
+
+ def is_container(self):
+ return False
+
+ def is_complex(self):
+ return self.has_references() or self.has_inv_query()
+
+ def contents(self):
+ msg = "Item class {0} does not implement contents()"
+ raise NotImplementedError(msg.format(self.__class__.__name__))
+
+ def merge_over(self, item):
+ msg = "Item class {0} does not implement merge_over()"
+ raise NotImplementedError(msg.format(self.__class__.__name__))
+
+ def render(self, context, exports):
+ msg = "Item class {0} does not implement render()"
+ raise NotImplementedError(msg.format(self.__class__.__name__))
diff --git a/reclass/values/listitem.py b/reclass/values/listitem.py
new file mode 100644
index 0000000..c7f29d0
--- /dev/null
+++ b/reclass/values/listitem.py
@@ -0,0 +1,41 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from item import Item
+from reclass.settings import Settings
+
+class ListItem(Item):
+
+ def __init__(self, item, settings):
+ self.type = Item.LIST
+ self._list = item
+ self._settings = settings
+
+ def contents(self):
+ return self._list
+
+ def is_container(self):
+ return True
+
+ def render(self, context, inventory):
+ return self._list
+
+ def merge_over(self, item):
+ if item.type == Item.LIST:
+ item._list.extend(self._list)
+ return item
+ elif item.type == Item.SCALAR:
+ if item.contents() is None:
+ return self
+ elif self._settings.allow_list_over_scalar:
+ self._list.insert(0, item.contents())
+ return self
+ else:
+ raise TypeError('allow list over scalar = False: cannot merge %s onto %s' % (repr(self), repr(item)))
+ raise TypeError('Cannot merge %s over %s' % (repr(self), repr(item)))
+
+ def __repr__(self):
+ return 'ListItem(%r)' % (self._list)
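The merge rules in `ListItem.merge_over` can be summarized with a small standalone sketch (hypothetical helper operating on raw Python values rather than `Item` wrappers):

```python
def list_merge_over(new, old, allow_list_over_scalar=False):
    # Sketch of ListItem.merge_over: a list merged over an older list
    # appends the new items after the old ones; over None the list simply
    # wins; over any other scalar only when allow_list_over_scalar is set.
    if isinstance(old, list):
        return old + new
    if old is None:
        return new
    if allow_list_over_scalar:
        return [old] + new
    raise TypeError('allow list over scalar = False: cannot merge %r over %r'
                    % (new, old))
```

Note the ordering: the older value always ends up first, so class hierarchy order is preserved when lists accumulate.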
diff --git a/reclass/values/parser.py b/reclass/values/parser.py
new file mode 100644
index 0000000..bdd881d
--- /dev/null
+++ b/reclass/values/parser.py
@@ -0,0 +1,64 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+import pyparsing as pp
+
+from compitem import CompItem
+from invitem import InvItem
+from refitem import RefItem
+from scaitem import ScaItem
+
+from reclass.errors import ParseError
+from reclass.values.parser_funcs import STR, REF, INV
+
+class Parser(object):
+
+ def parse(self, value, settings):
+ self._settings = settings
+ dollars = value.count('$')
+ if dollars == 0:
+ # speed up: only use pyparsing if there is a $ in the string
+ return ScaItem(value, self._settings)
+ elif dollars == 1:
+ # speed up: try a simple reference
+ try:
+ tokens = self._settings.simple_ref_parser.leaveWhitespace().parseString(value).asList()
+ except pp.ParseException:
+ # fall back on the full parser
+ try:
+ tokens = self._settings.ref_parser.leaveWhitespace().parseString(value).asList()
+ except pp.ParseException as e:
+ raise ParseError(e.msg, e.line, e.col, e.lineno)
+ else:
+ # use the full parser
+ try:
+ tokens = self._settings.ref_parser.leaveWhitespace().parseString(value).asList()
+ except pp.ParseException as e:
+ raise ParseError(e.msg, e.line, e.col, e.lineno)
+
+ items = self._create_items(tokens)
+ if len(items) == 1:
+ return items[0]
+ else:
+ return CompItem(items, self._settings)
+
+ _create_dict = { STR: (lambda s, v: ScaItem(v, s._settings)),
+ REF: (lambda s, v: s._create_ref(v)),
+ INV: (lambda s, v: s._create_inv(v)) }
+
+ def _create_items(self, tokens):
+ return [ self._create_dict[t](self, v) for t, v in tokens ]
+
+ def _create_ref(self, tokens):
+ items = [ self._create_dict[t](self, v) for t, v in tokens ]
+ return RefItem(items, self._settings)
+
+ def _create_inv(self, tokens):
+ items = [ ScaItem(v, self._settings) for t, v in tokens ]
+ if len(items) == 1:
+ return InvItem(items[0], self._settings)
+ else:
+ return InvItem(CompItem(items, self._settings), self._settings)
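The dollar-count fast path in `Parser.parse` above can be illustrated without pyparsing. This is a hypothetical standalone sketch of the heuristic only; the real code delegates to the grammars built in `parser_funcs.py`:

```python
import re

def parse_fast_path(value):
    # Sketch of the speed-up: no '$' means a plain scalar, so pyparsing is
    # skipped entirely; exactly one '$' first tries a cheap "single simple
    # reference" pattern before falling back to the full grammar.
    dollars = value.count('$')
    if dollars == 0:
        return ('scalar', value)
    simple = re.match(r'^([^\\$]*)\$\{([^${}\\]+)\}([^\\$]*)$', value)
    if dollars == 1 and simple:
        pre, ref, post = simple.groups()
        return ('simple_ref', pre, ref, post)
    return ('full_parser', value)
```

Strings like `'x ${a:b} y'` take the cheap path, while nested or multiple references (`'${a}${b}'`) fall through to the full parser, mirroring the three branches in `parse`.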
diff --git a/reclass/values/parser_funcs.py b/reclass/values/parser_funcs.py
new file mode 100644
index 0000000..bd5a1ba
--- /dev/null
+++ b/reclass/values/parser_funcs.py
@@ -0,0 +1,99 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+import pyparsing as pp
+
+STR = 1
+REF = 2
+INV = 3
+
+def _string(string, location, tokens):
+ token = tokens[0]
+ tokens[0] = (STR, token)
+
+def _reference(string, location, tokens):
+ token = list(tokens[0])
+ tokens[0] = (REF, token)
+
+def _invquery(string, location, tokens):
+ token = list(tokens[0])
+ tokens[0] = (INV, token)
+
+def get_ref_parser(escape_character, reference_sentinels, export_sentinels):
+ _ESCAPE = escape_character
+ _DOUBLE_ESCAPE = _ESCAPE + _ESCAPE
+
+ _REF_OPEN = reference_sentinels[0]
+ _REF_CLOSE = reference_sentinels[1]
+ _REF_CLOSE_FIRST = _REF_CLOSE[0]
+ _REF_ESCAPE_OPEN = _ESCAPE + _REF_OPEN
+ _REF_ESCAPE_CLOSE = _ESCAPE + _REF_CLOSE
+ _REF_DOUBLE_ESCAPE_OPEN = _DOUBLE_ESCAPE + _REF_OPEN
+ _REF_DOUBLE_ESCAPE_CLOSE = _DOUBLE_ESCAPE + _REF_CLOSE
+ _REF_EXCLUDES = _ESCAPE + _REF_OPEN + _REF_CLOSE
+
+ _INV_OPEN = export_sentinels[0]
+ _INV_CLOSE = export_sentinels[1]
+ _INV_CLOSE_FIRST = _INV_CLOSE[0]
+ _INV_ESCAPE_OPEN = _ESCAPE + _INV_OPEN
+ _INV_ESCAPE_CLOSE = _ESCAPE + _INV_CLOSE
+ _INV_DOUBLE_ESCAPE_OPEN = _DOUBLE_ESCAPE + _INV_OPEN
+ _INV_DOUBLE_ESCAPE_CLOSE = _DOUBLE_ESCAPE + _INV_CLOSE
+ _INV_EXCLUDES = _ESCAPE + _INV_OPEN + _INV_CLOSE
+
+ _EXCLUDES = _ESCAPE + _REF_OPEN + _REF_CLOSE + _INV_OPEN + _INV_CLOSE
+
+ double_escape = pp.Combine(pp.Literal(_DOUBLE_ESCAPE) + pp.MatchFirst([pp.FollowedBy(_REF_OPEN), pp.FollowedBy(_REF_CLOSE),
+ pp.FollowedBy(_INV_OPEN), pp.FollowedBy(_INV_CLOSE)])).setParseAction(pp.replaceWith(_ESCAPE))
+
+ ref_open = pp.Literal(_REF_OPEN).suppress()
+ ref_close = pp.Literal(_REF_CLOSE).suppress()
+ ref_not_open = ~pp.Literal(_REF_OPEN) + ~pp.Literal(_REF_ESCAPE_OPEN) + ~pp.Literal(_REF_DOUBLE_ESCAPE_OPEN)
+ ref_not_close = ~pp.Literal(_REF_CLOSE) + ~pp.Literal(_REF_ESCAPE_CLOSE) + ~pp.Literal(_REF_DOUBLE_ESCAPE_CLOSE)
+ ref_escape_open = pp.Literal(_REF_ESCAPE_OPEN).setParseAction(pp.replaceWith(_REF_OPEN))
+ ref_escape_close = pp.Literal(_REF_ESCAPE_CLOSE).setParseAction(pp.replaceWith(_REF_CLOSE))
+ ref_text = pp.CharsNotIn(_REF_EXCLUDES) | pp.CharsNotIn(_REF_CLOSE_FIRST, exact=1)
+ ref_content = pp.Combine(pp.OneOrMore(ref_not_open + ref_not_close + ref_text))
+ ref_string = pp.MatchFirst([double_escape, ref_escape_open, ref_escape_close, ref_content]).setParseAction(_string)
+ ref_item = pp.Forward()
+ ref_items = pp.OneOrMore(ref_item)
+ reference = (ref_open + pp.Group(ref_items) + ref_close).setParseAction(_reference)
+ ref_item << (reference | ref_string)
+
+ inv_open = pp.Literal(_INV_OPEN).suppress()
+ inv_close = pp.Literal(_INV_CLOSE).suppress()
+ inv_not_open = ~pp.Literal(_INV_OPEN) + ~pp.Literal(_INV_ESCAPE_OPEN) + ~pp.Literal(_INV_DOUBLE_ESCAPE_OPEN)
+ inv_not_close = ~pp.Literal(_INV_CLOSE) + ~pp.Literal(_INV_ESCAPE_CLOSE) + ~pp.Literal(_INV_DOUBLE_ESCAPE_CLOSE)
+ inv_escape_open = pp.Literal(_INV_ESCAPE_OPEN).setParseAction(pp.replaceWith(_INV_OPEN))
+ inv_escape_close = pp.Literal(_INV_ESCAPE_CLOSE).setParseAction(pp.replaceWith(_INV_CLOSE))
+ inv_text = pp.CharsNotIn(_INV_CLOSE_FIRST)
+ inv_content = pp.Combine(pp.OneOrMore(inv_not_close + inv_text))
+ inv_string = pp.MatchFirst([double_escape, inv_escape_open, inv_escape_close, inv_content]).setParseAction(_string)
+ inv_items = pp.OneOrMore(inv_string)
+ export = (inv_open + pp.Group(inv_items) + inv_close).setParseAction(_invquery)
+
+ text = pp.CharsNotIn(_EXCLUDES) | pp.CharsNotIn('', exact=1)
+ content = pp.Combine(pp.OneOrMore(ref_not_open + inv_not_open + text))
+ string = pp.MatchFirst([double_escape, ref_escape_open, inv_escape_open, content]).setParseAction(_string)
+
+ item = reference | export | string
+ line = pp.OneOrMore(item) + pp.StringEnd()
+ return line
+
+def get_simple_ref_parser(escape_character, reference_sentinels, export_sentinels):
+ _ESCAPE = escape_character
+ _REF_OPEN = reference_sentinels[0]
+ _REF_CLOSE = reference_sentinels[1]
+ _INV_OPEN = export_sentinels[0]
+ _INV_CLOSE = export_sentinels[1]
+ _EXCLUDES = _ESCAPE + _REF_OPEN + _REF_CLOSE + _INV_OPEN + _INV_CLOSE
+
+ string = pp.CharsNotIn(_EXCLUDES).setParseAction(_string)
+ ref_open = pp.Literal(_REF_OPEN).suppress()
+ ref_close = pp.Literal(_REF_CLOSE).suppress()
+ reference = (ref_open + pp.Group(string) + ref_close).setParseAction(_reference)
+ line = pp.StringStart() + pp.Optional(string) + reference + pp.Optional(string) + pp.StringEnd()
+ return line
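The token stream the grammars above produce can be sketched for the simple, non-nested, unescaped case. This is a hypothetical regex-based stand-in, not the pyparsing grammar: plain text becomes `(STR, text)` and `${name}` becomes `(REF, [(STR, name)])`, matching the tuples that `_string` and `_reference` attach:

```python
import re

STR = 1
REF = 2

def tokenize(s):
    # Sketch of the (type, value) token pairs for flat input; the real
    # grammar additionally handles nesting, inventory queries and the
    # escape sequences \${ and \\${.
    tokens = []
    for m in re.finditer(r'\$\{([^}]*)\}|([^$]+)', s):
        if m.group(1) is not None:
            tokens.append((REF, [(STR, m.group(1))]))
        else:
            tokens.append((STR, m.group(2)))
    return tokens
```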
diff --git a/reclass/values/refitem.py b/reclass/values/refitem.py
new file mode 100644
index 0000000..0ae65e6
--- /dev/null
+++ b/reclass/values/refitem.py
@@ -0,0 +1,70 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from item import Item
+from reclass.defaults import REFERENCE_SENTINELS
+from reclass.settings import Settings
+from reclass.utils.dictpath import DictPath
+from reclass.errors import ResolveError
+
+
+class RefItem(Item):
+
+ def __init__(self, items, settings):
+ self.type = Item.REFERENCE
+ self._settings = settings
+ self._items = items
+ self._refs = []
+ self._allRefs = False
+ self.assembleRefs()
+
+ def assembleRefs(self, context={}):
+ self._refs = []
+ self._allRefs = True
+ for item in self._items:
+ if item.has_references():
+ item.assembleRefs(context)
+ self._refs.extend(item.get_references())
+ if item.allRefs() is False:
+ self._allRefs = False
+ try:
+ strings = [ str(i.render(context, None)) for i in self._items ]
+ value = "".join(strings)
+ self._refs.append(value)
+ except ResolveError as e:
+ self._allRefs = False
+
+ def contents(self):
+ return self._items
+
+ def allRefs(self):
+ return self._allRefs
+
+ def has_references(self):
+ return len(self._refs) > 0
+
+ def get_references(self):
+ return self._refs
+
+ def _resolve(self, ref, context):
+ path = DictPath(self._settings.delimiter, ref)
+ try:
+ return path.get_value(context)
+ except (KeyError, TypeError) as e:
+ raise ResolveError(ref)
+
+ def render(self, context, inventory):
+ if len(self._items) == 1:
+ return self._resolve(self._items[0].render(context, inventory), context)
+ strings = [ str(i.render(context, inventory)) for i in self._items ]
+ return self._resolve("".join(strings), context)
+
+ def __repr__(self):
+ return 'RefItem(%r)' % self._items
+
+ def __str__(self):
+ strings = [ str(i) for i in self._items ]
+ return '{0}{1}{2}'.format(REFERENCE_SENTINELS[0], ''.join(strings), REFERENCE_SENTINELS[1])
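The lookup behind `RefItem._resolve` can be sketched standalone (hypothetical helper; reclass's `DictPath` additionally handles list indices and escaped delimiters, omitted here):

```python
def resolve_ref(ref, context, delimiter=':'):
    # Sketch of RefItem._resolve: split the reference on the delimiter and
    # walk nested dicts in the parameter context.
    node = context
    for part in ref.split(delimiter):
        try:
            node = node[part]
        except (KeyError, TypeError):
            raise KeyError(ref)  # the real code raises ResolveError
    return node
```

So `${motd:greeting}` resolves by walking `context['motd']['greeting']`, and any missing key along the way surfaces as a resolution error carrying the full reference string.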
diff --git a/reclass/values/scaitem.py b/reclass/values/scaitem.py
new file mode 100644
index 0000000..466d3c9
--- /dev/null
+++ b/reclass/values/scaitem.py
@@ -0,0 +1,42 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from reclass.settings import Settings
+from item import Item
+
+class ScaItem(Item):
+
+ def __init__(self, value, settings):
+ self.type = Item.SCALAR
+ self._value = value
+ self._settings = settings
+
+ def contents(self):
+ return self._value
+
+ def merge_over(self, item):
+ if item.type == Item.SCALAR:
+ return self
+ elif item.type == Item.LIST:
+ if self._settings.allow_scalar_over_list or (self._settings.allow_none_override and self._value in [None, 'none', 'None']):
+ return self
+ else:
+ raise TypeError('allow scalar over list = False: cannot merge %s over %s' % (repr(self), repr(item)))
+ elif item.type == Item.DICTIONARY:
+ if self._settings.allow_scalar_over_dict or (self._settings.allow_none_override and self._value in [None, 'none', 'None']):
+ return self
+ else:
+ raise TypeError('allow scalar over dict = False: cannot merge %s over %s' % (repr(self), repr(item)))
+ raise TypeError('Cannot merge %s over %s' % (repr(self), repr(item)))
+
+ def render(self, context, inventory):
+ return self._value
+
+ def __repr__(self):
+ return 'ScaItem({0!r})'.format(self._value)
+
+ def __str__(self):
+ return str(self._value)
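The precedence rules in `ScaItem.merge_over` can be condensed into a standalone sketch (hypothetical helper; `settings` is a plain dict of flags here rather than a reclass `Settings` object):

```python
def scalar_merge_over(new, old, settings):
    # Sketch of ScaItem.merge_over: scalar over scalar always wins; scalar
    # over list/dict needs the matching allow_* flag, or a None-like value
    # combined with allow_none_override.
    none_like = new in (None, 'none', 'None')
    override = settings.get('allow_none_override') and none_like
    if isinstance(old, list):
        if settings.get('allow_scalar_over_list') or override:
            return new
        raise TypeError('allow scalar over list = False')
    if isinstance(old, dict):
        if settings.get('allow_scalar_over_dict') or override:
            return new
        raise TypeError('allow scalar over dict = False')
    return new
```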
diff --git a/reclass/values/tests/__init__.py b/reclass/values/tests/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/reclass/values/tests/__init__.py
diff --git a/reclass/utils/tests/test_refvalue.py b/reclass/values/tests/test_value.py
similarity index 65%
rename from reclass/utils/tests/test_refvalue.py
rename to reclass/values/tests/test_value.py
index 23d7e7b..84403d3 100644
--- a/reclass/utils/tests/test_refvalue.py
+++ b/reclass/values/tests/test_value.py
@@ -7,16 +7,18 @@
# Released under the terms of the Artistic Licence 2.0
#
-from reclass.utils.refvalue import RefValue
-from reclass.defaults import PARAMETER_INTERPOLATION_SENTINELS, \
- PARAMETER_INTERPOLATION_DELIMITER
-from reclass.errors import UndefinedVariableError, \
- IncompleteInterpolationError
+import pyparsing as pp
+
+from reclass.settings import Settings
+from reclass.values.value import Value
+from reclass.errors import ResolveError, ParseError
import unittest
+SETTINGS = Settings()
+
def _var(s):
- return '%s%s%s' % (PARAMETER_INTERPOLATION_SENTINELS[0], s,
- PARAMETER_INTERPOLATION_SENTINELS[1])
+ return '%s%s%s' % (SETTINGS.reference_sentinels[0], s,
+ SETTINGS.reference_sentinels[1])
CONTEXT = {'favcolour':'yellow',
'motd':{'greeting':'Servus!',
@@ -31,18 +33,18 @@
def _poor_mans_template(s, var, value):
return s.replace(_var(var), value)
-class TestRefValue(unittest.TestCase):
+class TestValue(unittest.TestCase):
def test_simple_string(self):
s = 'my cat likes to hide in boxes'
- tv = RefValue(s)
+ tv = Value(s, SETTINGS, '')
self.assertFalse(tv.has_references())
- self.assertEquals(tv.render(CONTEXT), s)
+ self.assertEquals(tv.render(CONTEXT, None), s)
def _test_solo_ref(self, key):
s = _var(key)
- tv = RefValue(s)
- res = tv.render(CONTEXT)
+ tv = Value(s, SETTINGS, '')
+ res = tv.render(CONTEXT, None)
self.assertTrue(tv.has_references())
self.assertEqual(res, CONTEXT[key])
@@ -63,65 +65,65 @@
def test_single_subst_bothends(self):
s = 'I like ' + _var('favcolour') + ' and I like it'
- tv = RefValue(s)
+ tv = Value(s, SETTINGS, '')
self.assertTrue(tv.has_references())
- self.assertEqual(tv.render(CONTEXT),
+ self.assertEqual(tv.render(CONTEXT, None),
_poor_mans_template(s, 'favcolour',
CONTEXT['favcolour']))
def test_single_subst_start(self):
s = _var('favcolour') + ' is my favourite colour'
- tv = RefValue(s)
+ tv = Value(s, SETTINGS, '')
self.assertTrue(tv.has_references())
- self.assertEqual(tv.render(CONTEXT),
+ self.assertEqual(tv.render(CONTEXT, None),
_poor_mans_template(s, 'favcolour',
CONTEXT['favcolour']))
def test_single_subst_end(self):
s = 'I like ' + _var('favcolour')
- tv = RefValue(s)
+ tv = Value(s, SETTINGS, '')
self.assertTrue(tv.has_references())
- self.assertEqual(tv.render(CONTEXT),
+ self.assertEqual(tv.render(CONTEXT, None),
_poor_mans_template(s, 'favcolour',
CONTEXT['favcolour']))
def test_deep_subst_solo(self):
- var = PARAMETER_INTERPOLATION_DELIMITER.join(('motd', 'greeting'))
- s = _var(var)
- tv = RefValue(s)
+ motd = SETTINGS.delimiter.join(('motd', 'greeting'))
+ s = _var(motd)
+ tv = Value(s, SETTINGS, '')
self.assertTrue(tv.has_references())
- self.assertEqual(tv.render(CONTEXT),
- _poor_mans_template(s, var,
+ self.assertEqual(tv.render(CONTEXT, None),
+ _poor_mans_template(s, motd,
CONTEXT['motd']['greeting']))
def test_multiple_subst(self):
- greet = PARAMETER_INTERPOLATION_DELIMITER.join(('motd', 'greeting'))
+ greet = SETTINGS.delimiter.join(('motd', 'greeting'))
s = _var(greet) + ' I like ' + _var('favcolour') + '!'
- tv = RefValue(s)
+ tv = Value(s, SETTINGS, '')
self.assertTrue(tv.has_references())
want = _poor_mans_template(s, greet, CONTEXT['motd']['greeting'])
want = _poor_mans_template(want, 'favcolour', CONTEXT['favcolour'])
- self.assertEqual(tv.render(CONTEXT), want)
+ self.assertEqual(tv.render(CONTEXT, None), want)
def test_multiple_subst_flush(self):
- greet = PARAMETER_INTERPOLATION_DELIMITER.join(('motd', 'greeting'))
+ greet = SETTINGS.delimiter.join(('motd', 'greeting'))
s = _var(greet) + ' I like ' + _var('favcolour')
- tv = RefValue(s)
+ tv = Value(s, SETTINGS, '')
self.assertTrue(tv.has_references())
want = _poor_mans_template(s, greet, CONTEXT['motd']['greeting'])
want = _poor_mans_template(want, 'favcolour', CONTEXT['favcolour'])
- self.assertEqual(tv.render(CONTEXT), want)
+ self.assertEqual(tv.render(CONTEXT, None), want)
def test_undefined_variable(self):
s = _var('no_such_variable')
- tv = RefValue(s)
- with self.assertRaises(UndefinedVariableError):
- tv.render(CONTEXT)
+ tv = Value(s, SETTINGS, '')
+ with self.assertRaises(ResolveError):
+ tv.render(CONTEXT, None)
def test_incomplete_variable(self):
- s = PARAMETER_INTERPOLATION_SENTINELS[0] + 'incomplete'
- with self.assertRaises(IncompleteInterpolationError):
- tv = RefValue(s)
+ s = SETTINGS.reference_sentinels[0] + 'incomplete'
+ with self.assertRaises(ParseError):
+ tv = Value(s, SETTINGS, '')
if __name__ == '__main__':
unittest.main()
diff --git a/reclass/values/value.py b/reclass/values/value.py
new file mode 100644
index 0000000..4ec6051
--- /dev/null
+++ b/reclass/values/value.py
@@ -0,0 +1,88 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+from parser import Parser
+from dictitem import DictItem
+from listitem import ListItem
+from scaitem import ScaItem
+from reclass.errors import InterpolationError
+
+class Value(object):
+
+ _parser = Parser()
+
+ def __init__(self, value, settings, uri):
+ self._settings = settings
+ self._uri = uri
+ if isinstance(value, str):
+ try:
+ self._item = self._parser.parse(value, self._settings)
+ except InterpolationError as e:
+ e.uri = self._uri
+ raise
+ elif isinstance(value, list):
+ self._item = ListItem(value, self._settings)
+ elif isinstance(value, dict):
+ self._item = DictItem(value, self._settings)
+ else:
+ self._item = ScaItem(value, self._settings)
+
+ def uri(self):
+ return self._uri
+
+ def is_container(self):
+ return self._item.is_container()
+
+ def allRefs(self):
+ return self._item.allRefs()
+
+ def has_references(self):
+ return self._item.has_references()
+
+ def has_inv_query(self):
+ return self._item.has_inv_query()
+
+ def needs_all_envs(self):
+ if self._item.has_inv_query():
+ return self._item.needs_all_envs()
+ else:
+ return False
+
+ def ignore_failed_render(self):
+ return self._item.ignore_failed_render()
+
+ def is_complex(self):
+ return self._item.is_complex()
+
+ def get_references(self):
+ return self._item.get_references()
+
+ def get_inv_references(self):
+ return self._item.get_inv_references()
+
+ def assembleRefs(self, context):
+ if self._item.has_references():
+ self._item.assembleRefs(context)
+
+ def render(self, context, inventory):
+ try:
+ return self._item.render(context, inventory)
+ except InterpolationError as e:
+ e.uri = self._uri
+ raise
+
+ def contents(self):
+ return self._item.contents()
+
+ def merge_over(self, value):
+ self._item = self._item.merge_over(value._item)
+ return self
+
+ def __repr__(self):
+ return 'Value(%r)' % self._item
+
+ def __str__(self):
+ return str(self._item)
diff --git a/reclass/values/valuelist.py b/reclass/values/valuelist.py
new file mode 100644
index 0000000..46d8ec7
--- /dev/null
+++ b/reclass/values/valuelist.py
@@ -0,0 +1,132 @@
+#
+# -*- coding: utf-8 -*-
+#
+# This file is part of reclass
+#
+
+import copy
+import sys
+
+from reclass.errors import ResolveError
+
+class ValueList(object):
+
+ def __init__(self, value, settings):
+ self._settings = settings
+ self._refs = []
+ self._allRefs = True
+ self._values = [ value ]
+ self._inv_refs = []
+ self._has_inv_query = False
+ self._ignore_failed_render = False
+ self._update()
+
+ def append(self, value):
+ self._values.append(value)
+ self._update()
+
+ def extend(self, values):
+ self._values.extend(values._values)
+ self._update()
+
+ def _update(self):
+ self.assembleRefs()
+ self._check_for_inv_query()
+
+ def has_references(self):
+ return len(self._refs) > 0
+
+ def has_inv_query(self):
+ return self._has_inv_query
+
+ def get_inv_references(self):
+ return self._inv_refs
+
+ def is_complex(self):
+ return self.has_references() or self.has_inv_query()
+
+ def get_references(self):
+ return self._refs
+
+ def allRefs(self):
+ return self._allRefs
+
+ def ignore_failed_render(self):
+ return self._ignore_failed_render
+
+ def _check_for_inv_query(self):
+ self._has_inv_query = False
+ self._ignore_failed_render = True
+ for value in self._values:
+ if value.has_inv_query():
+ self._inv_refs.extend(value.get_inv_references())
+ self._has_inv_query = True
+ if value.ignore_failed_render() is False:
+ self._ignore_failed_render = False
+ if self._has_inv_query is False:
+ self._ignore_failed_render = False
+
+ def assembleRefs(self, context={}):
+ self._refs = []
+ self._allRefs = True
+ for value in self._values:
+ value.assembleRefs(context)
+ if value.has_references():
+ self._refs.extend(value.get_references())
+ if value.allRefs() is False:
+ self._allRefs = False
+
+ def merge(self):
+ output = None
+ for n, value in enumerate(self._values):
+ if output is None:
+ output = value
+ else:
+ output = value.merge_over(output)
+ return output
+
+ def render(self, context, inventory):
+ from reclass.datatypes.parameters import Parameters
+
+ output = None
+ deepCopied = False
+ last_error = None
+ for n, value in enumerate(self._values):
+ try:
+ new = value.render(context, inventory)
+ except ResolveError as e:
+ if self._settings.ignore_overwritten_missing_references and not isinstance(output, (dict, list)) and n != (len(self._values)-1):
+ new = None
+ last_error = e
+ print >>sys.stderr, "[WARNING] Reference '%s' undefined" % (str(value))
+ else:
+ raise e
+
+ if output is None:
+ output = new
+ deepCopied = False
+ else:
+ if isinstance(output, dict) and isinstance(new, dict):
+ p1 = Parameters(output, self._settings, None, merge_initialise = False)
+ p2 = Parameters(new, self._settings, None, merge_initialise = False)
+ p1.merge(p2, wrap=False)
+ output = p1.as_dict()
+ continue
+ elif isinstance(output, list) and isinstance(new, list):
+ if not deepCopied:
+ output = copy.deepcopy(output)
+ deepCopied = True
+ output.extend(new)
+ continue
+ elif isinstance(output, (dict, list)) or isinstance(new, (dict, list)):
+ raise TypeError('Cannot merge %s over %s' % (repr(self._values[n]), repr(self._values[n-1])))
+ else:
+ output = new
+
+ if isinstance(output, (dict, list)) and last_error is not None:
+ raise last_error
+
+ return output
+
+ def __repr__(self):
+ return 'ValueList(%r)' % self._values
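The merge rules that `ValueList.render` applies once each value is rendered can be sketched on plain Python values (hypothetical simplification: the dict branch here is a shallow `update`, whereas reclass performs a recursive `Parameters` merge):

```python
import copy

def render_merge(rendered):
    # Sketch of ValueList.render's fold: dicts merge key-by-key, lists
    # concatenate, scalars overwrite; mixing a container with a scalar
    # is a type error.
    output = None
    for new in rendered:
        if output is None:
            output = copy.deepcopy(new)
        elif isinstance(output, dict) and isinstance(new, dict):
            output.update(new)  # shallow here; reclass merges recursively
        elif isinstance(output, list) and isinstance(new, list):
            output = output + new
        elif isinstance(output, (dict, list)) or isinstance(new, (dict, list)):
            raise TypeError('cannot merge %r over %r' % (new, output))
        else:
            output = new
    return output
```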
diff --git a/reclass/version.py b/reclass/version.py
index a2aa99a..90c2cb7 100644
--- a/reclass/version.py
+++ b/reclass/version.py
@@ -7,12 +7,12 @@
# Released under the terms of the Artistic Licence 2.0
#
RECLASS_NAME = 'reclass'
-DESCRIPTION = 'merge data by recursive descent down an ancestry hierarchy'
-VERSION = '1.4.1'
-AUTHOR = 'martin f. krafft'
-AUTHOR_EMAIL = 'reclass@pobox.madduck.net'
-MAINTAINER = 'Jason Ritzke (@Rtzq0)'
-MAINTAINER_EMAIL = 'jasonritzke@4loopz.com'
-COPYRIGHT = 'Copyright © 2007–14 ' + AUTHOR
+DESCRIPTION = 'merge data by recursive descent down an ancestry hierarchy (forked extended version)'
+VERSION = '1.5.2'
+AUTHOR = 'martin f. krafft / Andrew Pickford / salt-formulas community'
+AUTHOR_EMAIL = 'salt-formulas@freelists.org'
+MAINTAINER = 'salt-formulas community'
+MAINTAINER_EMAIL = 'salt-formulas@freelists.org'
+COPYRIGHT = 'Copyright © 2007–14 martin f. krafft, extensions © 2017 Andrew Pickford, extensions © salt-formulas community'
LICENCE = 'Artistic Licence 2.0'
-URL = 'https://github.com/madduck/reclass'
+URL = 'https://github.com/salt-formulas/reclass'
diff --git a/releasenotes/config.yaml b/releasenotes/config.yaml
new file mode 100644
index 0000000..6a6923d
--- /dev/null
+++ b/releasenotes/config.yaml
@@ -0,0 +1,66 @@
+---
+# Usage:
+#
+# reno -qd .releasenotes list
+# reno -qd .releasenotes new slug-title --edit
+# reno -qd .releasenotes report --no-show-source
+
+# Change prelude_section_name to 'summary' from default value prelude
+prelude_section_name: summary
+show_source: False
+sections:
+ - [summary, Summary]
+ - [features, New features]
+ - [fixes, Bug fixes]
+ - [others, Other notes]
+template: |
+ ---
+ # Author the following sections or remove the section if it is not related.
+ # Use one release note per feature.
+ #
+ # If you miss a section from the list below, please first submit a review
+ # adding it to .releasenotes/config.yaml.
+ #
+ # Format content with reStructuredText (RST).
+ # **Formatting examples:**
+ # - |
+ # This is a brief description of the feature. It may include a
+ # number of components:
+ #
+ # * List item 1
+ # * List item 2.
+ # This code block below will appear as part of the list item 2:
+ #
+ # .. code-block:: yaml
+ #
+ # classes:
+ # - system.class.to.load
+ #
+ # The code block below will appear on the same level as the feature
+ # description:
+ #
+ # .. code-block:: text
+ #
+ # provide model/formula pillar snippets
+
+
+ summary: >
+ This section is not mandatory. Use it to highlight the change.
+
+ features:
+ - Use the list to record a summary of **NEW** features
+ - Provide a detailed description of each feature, indicating the use cases
+ where users benefit from it
+ - Provide steps to deploy the feature (if the procedure is complicated,
+ indicate during which stage of the deployment workflow it should be
+ deployed).
+ - Provide troubleshooting information, if any.
+
+ fixes:
+ - Use the list to record a summary of fixes for blocker and critical bugs.
+ - Provide a brief summary of what has been fixed.
+
+ others:
+ - Author any additional notes. Use this section if a note is not related to
+ any of the common sections above.
+
diff --git a/releasenotes/notes/escaping-references-e76699d8ca010013.yaml b/releasenotes/notes/escaping-references-e76699d8ca010013.yaml
new file mode 100644
index 0000000..41845ee
--- /dev/null
+++ b/releasenotes/notes/escaping-references-e76699d8ca010013.yaml
@@ -0,0 +1,3 @@
+---
+others:
+ - The escaping of references changes how the constructs '\${xxx}' and '\\${xxx}' are rendered.
diff --git a/requirements.txt b/requirements.txt
index c3726e8..ea72e95 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1 +1,3 @@
+pyparsing
pyyaml
+pygit2
diff --git a/setup.cfg b/setup.cfg
index d645be7..2f5e543 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -3,3 +3,6 @@
# 3. If at all possible, it is good practice to do this. If you cannot, you
# will need to generate wheels for each Python version that you support.
universal=0
+
+[install]
+prefix: /usr
diff --git a/setup.py b/setup.py
index 3830b84..2fb77ae 100644
--- a/setup.py
+++ b/setup.py
@@ -37,8 +37,8 @@
license = LICENCE,
url = URL,
packages = find_packages(exclude=['*tests']), #FIXME validate this
- entry_points = { 'console_scripts': console_scripts },
- install_requires = ['pyyaml'],
+ entry_points = { 'console_scripts': console_scripts },
+ install_requires = ['pyparsing', 'pyyaml'], #FIXME pygit2 (requires libffi-dev, libgit2-dev 0.26.x)
classifiers=[
'Development Status :: 4 - Beta',