
v0.1.0 #3 (Open)

wants to merge 673 commits into master from develop
Conversation

@Bengt (Member) commented Nov 20, 2013

  • optimizers
    • SAES
    • Rechenberg
  • plugins
    • visualization
    • timeout
  • tests
    • unit
    • integration
    • system
    • examples
  • complete documentation
    • getting started
    • examples
    • API reference
    • README.rst
  • complete refactoring

Not part of this release

  • subinvoker
  • webinterface

renke and others added 30 commits March 16, 2014 16:25
Also return Exception object from timeout (instead of string).

Also improve print plugin in case of errors.
Mocking a return type for tuple packaging.
Candidates have their first character removed if it is ".".
Conflicts:
	metaopt/invoker/multiprocess.py
Conflicts:
	metaopt/invoker/util/task_worker_db.py
* The timeout is now cancelled when the optimization has already
  finished.

* Handle the case where the optimization finished without finding any
  parameters.
@coveralls

Coverage Status: Changes Unknown when pulling 8812055 on develop into master.

@coveralls

Coverage Status: Changes Unknown when pulling 5804487 on develop into master.

@coveralls

Coverage Status: Changes Unknown when pulling a395f78 on develop into master.

@coveralls

Coverage Status: Changes Unknown when pulling 67f4ffc on develop into master.

renke and others added 3 commits June 30, 2014 11:16
diff --git a/metaopt/core/paramspec/paramspec.py b/metaopt/core/paramspec/paramspec.py
index cbd6c79..9c00787 100644
--- a/metaopt/core/paramspec/paramspec.py
+++ b/metaopt/core/paramspec/paramspec.py
@@ -59,6 +59,10 @@ class ParamSpec(object):

         return ordered_params

+    @property
+    def dimensions(self):
+        return len(self.params.values())
+
     def add_param(self, param):
         """Add a param to this param_spec object manually"""
         if param.name in self.params:
diff --git a/metaopt/tests/unit/core/param/paramspec.py b/metaopt/tests/unit/core/param/paramspec.py
index a4ba7b3..3e948b0 100644
--- a/metaopt/tests/unit/core/param/paramspec.py
+++ b/metaopt/tests/unit/core/param/paramspec.py
@@ -121,5 +121,15 @@ class TestParamspec(object):
         assert param_spec.params["b"].title == "β"
         assert param_spec.params["g"].title == "γ"

+    def test_dimensions_given_three_parameters(self):
+        param_spec = ParamSpec()
+
+        param_spec.int("a", interval=(1, 10))
+        param_spec.float("b", interval=(0, 1))
+        param_spec.bool("g")
+
+        assert param_spec.dimensions == 3
+
+
 if __name__ == '__main__':
     nose.runmodule()
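A minimal usage sketch of the new dimensions property (a hypothetical standalone script, not part of the diff; it assumes the ParamSpec API shown in the test above):

# Count the dimensions of a parameter specification.
from metaopt.core.paramspec.paramspec import ParamSpec

param_spec = ParamSpec()
param_spec.int("a", interval=(1, 10))   # one integer parameter
param_spec.float("b", interval=(0, 1))  # one float parameter
param_spec.bool("g")                    # one boolean parameter

# dimensions counts the declared parameters, so this prints 3.
print(param_spec.dimensions)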
@coveralls

Coverage Status: Changes Unknown when pulling 93b2c2d on develop into master.

diff --git a/metaopt/optimizer/saes.py b/metaopt/optimizer/saes.py
index 4bec993..d6e984a 100644
--- a/metaopt/optimizer/saes.py
+++ b/metaopt/optimizer/saes.py
@@ -7,7 +7,7 @@ from __future__ import absolute_import, division, print_function, \
     unicode_literals, with_statement

 # Standard Library
-from math import exp
+from math import exp, sqrt
 from random import gauss, sample

 # First Party
@@ -33,12 +33,10 @@ class SAESOptimizer(Optimizer):
     run indefinitely.

     """
-    MU = 15
+    MU = 15
     LAMBDA = 100
-    TAU0 = 0.5
-    TAU1 = 0.5

-    def __init__(self, mu=MU, lamb=LAMBDA, tau0=TAU0, tau1=TAU1):
+    def __init__(self, mu=MU, lamb=LAMBDA, tau0=None, tau1=None):
         """
         :param mu: Number of parent arguments
         :param lamb: Number of offspring arguments
@@ -50,6 +48,7 @@ class SAESOptimizer(Optimizer):
         # TODO: Make sure these value are sane
         self.mu = mu
         self.lamb = lamb
+
         self.tau0 = tau0
         self.tau1 = tau1

@@ -69,6 +68,14 @@ class SAESOptimizer(Optimizer):
         self._invoker = invoker
         self.param_spec = param_spec

+        N = self.param_spec.dimensions
+
+        if self.tau0 is None:
+            self.tau0 = 1 / sqrt(2 * N)
+
+        if self.tau1 is None:
+            self.tau1 = 1 / sqrt(2 * sqrt(N))
+
         self.initalize_population()
         self.score_population()
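For context (not part of the diff): these defaults are the standard self-adaptation learning rates for an SAES, tau0 = 1/sqrt(2*N) and tau1 = 1/sqrt(2*sqrt(N)), where N is the number of problem dimensions. A quick sketch of what optimize() now computes, assuming N = 3 as in the ParamSpec test above:

from math import sqrt

N = 3                         # param_spec.dimensions
tau0 = 1 / sqrt(2 * N)        # global learning rate, ~0.408 for N = 3
tau1 = 1 / sqrt(2 * sqrt(N))  # per-coordinate learning rate, ~0.537 for N = 3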
@coveralls

Coverage Status: Changes Unknown when pulling 5554cc1 on develop into master.

jpzk and others added 3 commits June 30, 2014 11:39
diff --git a/metaopt/core/arg/util/creator.py b/metaopt/core/arg/util/creator.py
index 4ff02d0..b108417 100644
--- a/metaopt/core/arg/util/creator.py
+++ b/metaopt/core/arg/util/creator.py
@@ -20,13 +20,16 @@ class ArgsCreator(object):
         self.param_spec = param_spec

     def args(self, values=None):
-        """Returns an args derived from the params given on instantiation.Given values, uses values"""
-	param_values = self.param_spec.params.values()
-        if values == None:
+        """Returns an args derived from the params given on instantiation. Given
+         values, uses values"""
+        param_values = self.param_spec.params.values()
+
+        if values is None:
             return [create_arg(param) for param in param_values]
-	else:
-	    mapping = zip(param_values, values)
-            return [create_arg(param, value) for param, value in mapping]
+        else:
+            mapping = zip(param_values, values)
+
+        return [create_arg(param, value) for param, value in mapping]

     def random(self):
         """Returns a randomized version of self.args()."""
diff --git a/metaopt/optimizer/gridsearch.py b/metaopt/optimizer/gridsearch.py
index 227aafd..6f2e9c5 100644
--- a/metaopt/optimizer/gridsearch.py
+++ b/metaopt/optimizer/gridsearch.py
@@ -14,7 +14,7 @@ class GridSearchOptimizer(Optimizer):

     def __init__(self):
         super(GridSearchOptimizer, self).__init__()
-        self.best = (None, None)
+        self.best = (None, None) # (args, fitness)

     def optimize(self, invoker, param_spec, return_spec=None):
         del return_spec  # TODO
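A usage sketch for the fixed ArgsCreator.args() (hypothetical, pieced together from the signatures visible in the diff above):

from metaopt.core.arg.util.creator import ArgsCreator

creator = ArgsCreator(param_spec)

# Without values: one default arg per declared param.
default_args = creator.args()

# With values: each value is zipped with its corresponding param.
chosen_args = creator.args(values=[5, 0.5, True])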
@coveralls

Coverage Status: Changes Unknown when pulling b979e52 on develop into master.


diff --git a/metaopt/optimizer/saes.py b/metaopt/optimizer/saes.py
index e847e00..a0f021f 100644
--- a/metaopt/optimizer/saes.py
+++ b/metaopt/optimizer/saes.py
@@ -57,7 +57,7 @@ class SAESOptimizer(Optimizer):

         self.population = []
         self.scored_population = []
-        self.best_scored_indivual = (None, None)
+        self.best_scored_individual = (None, None)

         self.aborted = False
         self.generation = 1
@@ -89,13 +89,13 @@ class SAESOptimizer(Optimizer):
             self.score_population()

             if self.aborted:
-                return self.best_scored_indivual[0][0]
+                return self.best_scored_individual[0][0]

             self.select_parents()

             self.generation += 1

-        return self.best_scored_indivual[0][0]
+        return self.best_scored_individual[0][0]

     def exit_condition(self):
         pass
@@ -119,14 +119,14 @@ class SAESOptimizer(Optimizer):
             mean = lambda x1, x2: float((x1 + x2) / 2)
             child_args_sigma = map(mean, mother[1], father[1])

-            child_args = ArgsModifier.randomize(child_args, child_args_sigma)
+            child_args = ArgsModifier.mutate(child_args, child_args_sigma)

             self.tau0_random = gauss(0, 1)

             def mutate_sigma(sigma):
-                tau0_randomized = self.tau0 * self.tau0_random
-                tau1_randomized = self.tau1 * gauss(0, 1)
-                return sigma * exp(tau0_randomized) * exp(tau1_randomized)
+                tau0_mutated = self.tau0 * self.tau0_random
+                tau1_mutated = self.tau1 * gauss(0, 1)
+                return sigma * exp(tau0_mutated) * exp(tau1_mutated)

             child_args_sigma = map(mutate_sigma, child_args_sigma)

@@ -162,10 +162,10 @@ class SAESOptimizer(Optimizer):
         scored_individual = (individual, fitness)
         self.scored_population.append(scored_individual)

-        _, best_fitness = self.best_scored_indivual
+        _, best_fitness = self.best_scored_individual

         if best_fitness is None or fitness < best_fitness:
-            self.best_scored_indivual = scored_individual
+            self.best_scored_individual = scored_individual

     def on_error(self, value, fargs, individual, **kwargs):
         del value  # TODO
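For reference (not part of the diff): the renamed mutate_sigma implements the standard log-normal self-adaptation rule, sigma' = sigma * exp(tau0 * g0) * exp(tau1 * g1) with g0 and g1 drawn from N(0, 1), where g0 (self.tau0_random above) is drawn once per child and g1 once per coordinate. A standalone sketch, assuming the tau defaults from the earlier commit:

from math import exp
from random import gauss

tau0, tau1 = 0.408, 0.537  # defaults for N = 3, see above

g0 = gauss(0, 1)  # drawn once, shared across all coordinates of one child

def mutate_sigma(sigma):
    # Log-normal self-adaptation of a single step size.
    return sigma * exp(tau0 * g0) * exp(tau1 * gauss(0, 1))

sigmas = list(map(mutate_sigma, [0.1, 0.2, 0.3]))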
@coveralls

Coverage Status: Changes Unknown when pulling b7abaa8 on develop into master.

renke added 3 commits July 9, 2014 13:49
It's no longer needed because we have a working multiprocessing invoker.
I fixed a lot of broken links and code examples.
@coveralls

Coverage Status: Changes Unknown when pulling d24ae97 on develop into master.

renke and others added 3 commits July 23, 2014 20:16
Objective functions can now receive extra keyword arguments.
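A minimal sketch of what that enables (hypothetical; the function f and the data keyword are illustrative, not metaopt's confirmed API):

def f(a, b, data=None):
    """Objective over optimized parameters a and b; 'data' arrives as an
    extra keyword argument forwarded by the invoker on each call."""
    return sum((a * x - b) ** 2 for x in data)

# Equivalent direct call with the extra keyword argument:
fitness = f(2.0, 1.0, data=[1.0, 2.0, 3.0])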