Channel: AFPy's Planet

[novapost] Python decorators made easy


A Python decorator is, basically, a function that takes a function as argument and returns a function. This is a powerful feature, but it has some drawbacks:

  • decorators are quite tricky to develop. They certainly are for Python newbies; but even as an experienced Python developer, I must admit I have to think every time I write one. I often feel the decorator pattern is a bit complex.
  • decorators that take arguments are even trickier to develop. They are decorator factories, aka "functions that return a function that takes a function and returns a function". Inception, WTF?
  • decorators without arguments are used without parentheses, whereas decorators with arguments require parentheses, even if you pass no arguments at all. So when using a decorator, you have to wonder whether it takes arguments or not. A bit more to think about every time you use a decorator.
  • last but not least, function-based decorators are hard to test, because they return functions and you can't easily check their internals. How can you check the state of the decorator after it decorated the function, but before you actually run it? Classes are really helpful for that.

This article introduces class-based decorators for Python, as a convenient way to develop and use Python decorators.

The examples described below are available as a Python file at https://gist.github.com/benoitbryon/5168914

Decorators are tricky to develop and use

As a reminder of the drawbacks of decorators, here are some examples. If you are already aware of these facts, feel free to jump to the next section.

Here is a simple decorator which prints "moo" then executes input function:

def moo(func):
    def decorated(*args, **kwargs):
        print 'moo'
        return func(*args, **kwargs)  # Run decorated function.
    return decorated

You use it like this:

>>> @moo
... def i_am_a(kind):
...     print "I am a {kind}".format(kind=kind)
>>> i_am_a("duck")
moo
I am a duck

Here is the same decorator, extended so that you can configure the value to print:

def speak(word='moo'):
    def decorator(func):
        def decorated(*args, **kwargs):
            print word
            return func(*args, **kwargs)
        return decorated
    return decorator

You use it like that:

>>> @speak('quack')
... def i_am_a(kind):
...     print "I am a {kind}".format(kind=kind)
>>> i_am_a("duck")
quack
I am a duck

If you want to use the default arguments of "speak", you still have to use empty parentheses, i.e. you can't write @speak the way you used @moo; you have to write @speak() instead.
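The snippet below is not part of the original gist; it just illustrates the pitfall with the speak factory defined above (the quiet and quiet_bug names are made up for the example):

@speak()            # correct: speak() returns the actual decorator
def quiet():
    return 'quiet'

@speak              # wrong: quiet_bug itself is passed as the "word"
def quiet_bug():    # argument, so speak() returns its inner "decorator"
    return 'oops'   # function instead of a decorated quiet_bug

quiet()             # prints 'moo', then returns 'quiet'
quiet_bug           # this is now the inner "decorator" closure: calling it
                    # with a function would decorate that function instead
                    # of running the original quiet_bug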

I won't say more about decorators here; there are plenty of articles about them on the web. I just wanted to highlight the fact that even the simplest decorators are not as simple as they seem.

But they could be! Let's introduce class-based decorators...

Hello world example

Here is a sample usage of the Decorator class:

class Greeter(Decorator):
    """Greet return value of decorated function."""
    def setup(self, greeting='hello'):
        self.greeting = greeting

    def run(self, *args, **kwargs):
        name = super(Greeter, self).run(*args, **kwargs)
        return '{greeting} {name}!'.format(greeting=self.greeting, name=name)

The implementation is pretty simple, isn't it? So is the usage!

As a Decorator, you can use it without options.

>>> @Greeter
... def world():
...     return 'world'
>>> world()
'hello world!'

The example above is the same as providing empty options.

>>> @Greeter()
... def world():
...     return 'world'
>>> world()
'hello world!'

It accepts one greeting option:

>>> @Greeter(greeting='goodbye')
... def world():
...     return 'world'
>>> world()
'goodbye world!'

The greeting option defaults to 'hello':

>>> my_greeter = Greeter()
>>> my_greeter.greeting
'hello'

You can create a Greeter instance for later use:

>>> my_greeter = Greeter(greeting='hi')
>>> @my_greeter
... def world():
...     return 'world'
>>> world()
'hi world!'

Which gives you an opportunity to setup the greeter yourself:

>>> my_greeter = Greeter()
>>> my_greeter.greeting = 'bonjour'
>>> @my_greeter
... def world():
...     return 'world'
>>> world()
'bonjour world!'

In this example, all arguments are proxied to the decorated function:

>>> @Greeter
... def name(value):
...     return value
>>> name('world')
'hello world!'

>>> @Greeter(greeting='goodbye')
... def names(*args):
...     return ' and '.join(args)
>>> names('Laurel', 'Hardy')
'goodbye Laurel and Hardy!'
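As a side note, the class-based approach also keeps the testability claim concrete: the decorator's state can be inspected after setup and after decoration, before the decorated function ever runs. Here is a minimal unittest sketch (not from the original article; the GreeterTest name is made up):

import unittest

class GreeterTest(unittest.TestCase):
    def test_setup(self):
        greeter = Greeter(greeting='hi')
        self.assertEqual(greeter.greeting, 'hi')  # state after setup

    def test_decorate(self):
        greeter = Greeter()
        greeter(lambda: 'world')  # decorate a function...
        self.assertEqual(greeter.decorated(), 'world')  # ...and check internals

if __name__ == '__main__':
    unittest.main()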

Wrapping functions with functools

functools [1] provides utilities to "wrap" a function, i.e. make the decorator return value look like the original function.

Here is another class-based decorator sample. It adds "functools.update_wrapper" features to Decorator:

import functools

class Chameleon(Decorator):
    """A Decorator that looks like decorated function.

    It uses ``functools.update_wrapper``.

    This is a base class which acts as a transparent proxy for the
    decorated function. Consider overriding the ``run()`` method.

    .. warning::

       Take care of what you pass in ``assigned`` or ``updated``: you could
       break the Chameleon itself. For example, you should not pass "assigned",
       "run" or "__call__" in ``assigned``, unless you know what you are doing.

    """
    def setup(self,
              assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        self.assigned = assigned
        self.updated = updated

    def decorate(self, func):
        """Make self wrap the decorated function."""
        super(Chameleon, self).decorate(func)
        functools.update_wrapper(self, func,
                                 assigned=self.assigned,
                                 updated=self.updated)

Again, the implementation is pretty simple.

Let's look at the result...

>>> @Chameleon
... def documented():
...     '''Fake function with a docstring.'''
>>> documented.__doc__
'Fake function with a docstring.'

It accepts assigned and updated options, which are proxied to functools.update_wrapper.

Default values are functools.WRAPPER_ASSIGNMENTS for assigned and functools.WRAPPER_UPDATES for updated.

>>> def hello():
...    '''Hello world!'''
>>> wrapped = Chameleon(hello)
>>> wrapped.assigned
('__module__', '__name__', '__doc__')
>>> wrapped.updated
('__dict__',)
>>> wrapped.__doc__ == hello.__doc__
True
>>> wrapped.__name__ == hello.__name__
True

>>> only_doc_wrapped = Chameleon(hello, assigned=['__doc__'])
>>> only_doc_wrapped.__doc__ == hello.__doc__
True
>>> only_doc_wrapped.__name__ == hello.__name__  # doctest: +ELLIPSIS
Traceback (most recent call last):
    ...
AttributeError: 'Chameleon' object has no attribute '__name__'

>>> hello.__dict__ = {'some_attribute': 'some value'}  # Best on an object.
>>> attr_wrapped = Chameleon(hello, updated=['__dict__'])
>>> attr_wrapped.updated
['__dict__']
>>> attr_wrapped.some_attribute
'some value'

Here we have a good replacement for decorators using functools.wraps.

As an example, the traditional Django-style decorator shown in the Testing Django view decorators article...

from functools import wraps
from django.utils.decorators import available_attrs

def authenticated_user_passes_test(test_func,
                                   unauthorized=UnauthorizedView.as_view(),
                                   forbidden=ForbiddenView.as_view()):
    """Make sure user is authenticated and passes test."""
    def decorator(view_func):
        @wraps(view_func, assigned=available_attrs(view_func))
        def _wrapped_view(request, *args, **kwargs):
            if not request.user.is_authenticated():
                return unauthorized(request)
            if not test_func(request.user):
                return forbidden(request)
            return view_func(request, *args, **kwargs)
        return _wrapped_view
    return decorator

... would be written like this with class-based-style:

class authenticated_user_passes_test(Chameleon):
    """Make sure user is authenticated and passes test."""
    def setup(self, **kwargs):
        try:
            self.test_func = kwargs.pop('test_func')
        except KeyError:
            raise TypeError('decorator requires "test_func" keyword argument')
        self.unauthorized = kwargs.pop('unauthorized', UnauthorizedView.as_view())
        self.forbidden = kwargs.pop('forbidden', ForbiddenView.as_view())
        super(authenticated_user_passes_test, self).setup(**kwargs)

    def run(self, request, *args, **kwargs):
        if not request.user.is_authenticated():
            return self.unauthorized(request)
        if not self.test_func(request.user):
            return self.forbidden(request)
        return super(authenticated_user_passes_test, self).run(request, *args, **kwargs)

The class-based way is a bit longer because we have to handle the required test_func manually. But it clearly separates setup from run. It makes the code readable, easy to test, and easy to extend.
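For completeness, here is how the class-based version might be applied to a view (a hedged usage sketch, not from the original article; the staff_dashboard view and the is_staff_user test function are illustrative):

from django.shortcuts import render

def is_staff_user(user):
    """Illustrative test function: only staff members pass."""
    return user.is_staff

@authenticated_user_passes_test(test_func=is_staff_user)
def staff_dashboard(request):
    return render(request, 'dashboard.html')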

The Decorator class

Here is the base class. Little magic inside.

class Decorator(object):
    """Base class to easily create convenient decorators.

    Override :py:meth:`setup`, :py:meth:`run` or :py:meth:`decorate` to create
    custom decorators.

    Decorator instances are callables. The :py:meth:`__call__` method has a
    special implementation in Decorator. Generally, consider overriding
    :py:meth:`run` instead of :py:meth:`__call__`.

    """
    def __init__(self, func=None, **kwargs):
        """Constructor.

        Accepts one optional positional argument: the function to decorate.

        Other arguments **must** be keyword arguments.

        And beware passing ``func`` as keyword argument: it would be used as
        the function to decorate.

        """
        self.setup(**kwargs)
        if func is not None:
            self.decorate(func)

    def decorate(self, func):
        """Remember the function to decorate."""
        self.decorated = func

    def setup(self, **kwargs):
        """Store decorator's options."""
        self.options = kwargs

    def __call__(self, *args, **kwargs):
        """Run decorated function if available, else decorate first arg."""
        try:
            self.decorated
        except AttributeError:
            func = args[0]
            self.decorate(func)
            return self
        else:
            return self.run(*args, **kwargs)

    def run(self, *args, **kwargs):
        """Actually run the decorator.

        This base implementation is a transparent proxy to the decorated
        function: it passes positional and keyword arguments as is, and returns
        result.

        """
        return self.decorated(*args, **kwargs)

This base class transparently proxies to decorated function:

>>> @Decorator
... def return_args(*args, **kwargs):
...    return (args, kwargs)
>>> return_args()
((), {})
>>> return_args(1, 2, three=3)
((1, 2), {'three': 3})

This base class stores the decorator's options in an options dictionary. It doesn't use it itself... it's just a convenient mechanism for subclasses.

>>> @Decorator
... def nothing():
...    pass
>>> nothing.options
{}

>>> @Decorator()
... def nothing():
...    pass
>>> nothing.options
{}

>>> @Decorator(one=1)
... def nothing():
...    pass
>>> nothing.options
{'one': 1}
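To illustrate that mechanism, here is a hypothetical subclass (a sketch, not part of the original gist; the Tagger name and its behaviour are made up) that relies on the default setup() and reads self.options in run():

class Tagger(Decorator):
    """Wrap the return value in a tag taken from the options."""
    def run(self, *args, **kwargs):
        value = super(Tagger, self).run(*args, **kwargs)
        tag = self.options.get('tag', 'span')
        return '<{tag}>{value}</{tag}>'.format(tag=tag, value=value)

@Tagger(tag='strong')
def content():
    return 'hello'

# content() now returns '<strong>hello</strong>'.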

Limitations

This Decorator implementation has some limitations:

  • you can't use positional arguments to configure the decorator itself.
  • required decorator arguments must be handled explicitly, because you can't use positional arguments.
  • you have to remember the Decorator API, mainly setup() and run() methods.

Are there other limitations? Right now, I don't know...

Benefits

  • As a decorator author, you focus on setup() and run(). It is easy to remember. It produces readable code.
  • As a decorator user, you don't bother with parentheses. You just use the decorator depending on your needs, and it works.
  • As a test writer, you can write tests for a decorator's internals: what the state of the decorator is after its own initialization, what its state is after a run...

Would you use it?

See also

[1] http://docs.python.org/2.7/library/functools.html

[Biologeek] A quest for meaning


I (non-)spoke at SudWeb 2013, moderating a debate on the meaning of our involvement in our profession, following an informal session given at ParisWeb 2012. I had chosen very broad themes such as money, usefulness, recognition, adrenaline, sharing, health, ecology and fun. The goal was to have a non-technical debate and to get the audience to ask themselves a few questions about the choices they make professionally (and ultimately personally as well) over the course of their lives. A short retrospective on this hour of exchanges among more than 100 people.

What worked well

  • stating clear rules at the start of the session prevented the discussion from being monopolized by a single small group of people (after talking with the SudWeb staff: they were careful, when handing out the microphones, to share speaking time fairly, so it did not happen entirely on its own either ;-));
  • the layout of the room did a lot to make the debate possible and to let a face-to-face communication dynamic settle in;
  • participation was quite exceptional, with the debate starting immediately (I was a bit worried about ending up with a silent room) and an engagement that did not weaken during the session;
  • the variety of viewpoints and of the participants' levels of perspective; personally I found it quite refreshing to get remarks going off in all directions without necessarily following the proposed themes;
  • my silence: I managed to hold back from speaking several times in order to leave the floor to the room; it is extremely frustrating, but I think it benefited the debate;
  • the discussions it generated over the following days; I had never received so much feedback after a talk, and beyond the personal feedback I observed many exchanges, on the fringes of the sessions, related to the debate, which can only delight me.

What could be improved

  • a lot of consensus in the exchanges, and I am struggling to see how this could evolve, or even whether it should. My fear is rather about the "care bears" mode that speaking in public switches on;
  • introducing the themes through questions focused the room on the questions and little on the theme in its broadest sense; it is a pity, but I did not find a better way to launch the topics;
  • the topic about adrenaline was not understood by everyone and I got very mixed feedback afterwards (some did not relate to it at all, others were really into it); the goal was mostly to take a break between the rather heavy subjects discussed before and after;
  • the feeling of frustration given the number of people who wanted to speak, but that's the game: we had 60 minutes for about a hundred people, which leaves only a few seconds per participant...

What next?

An open question by way of conclusion, without really having a concrete technical proposal for continuing the debate. After some thought (and many discussions), I do not think it is relevant to continue online; however, I would be delighted if the discussions continued, here or elsewhere, hoping I have sown a few seeds that can germinate from one person to the next.

[anybox] Python: comprehensions in 1 minute

[tarek] A step-by-step introduction to Circus


Note

Circus is a process & socket manager. See https://circus.readthedocs.org

https://farm9.staticflickr.com/8420/8751753401_0760d37279.jpg

Photo by kennethreitz

During Django Con, I was asked how to use Circus to run & monitor a Python web application. The documentation has no single page step-by-step tutorial yet, so here goes... this blog post will be integrated into the documentation for the next release.

Installation

Circus is tested under Mac OS X and Linux, on the latest Python 2.6 and 2.7. To run a full Circus, you will also need libzmq, libevent & virtualenv.

Under Debuntu:

$ sudo apt-get install libzmq-dev libevent python-virtualenv

Create a virtualenv and install circus, circus-web and chaussette in it

$ virtualenv /tmp/circus
$ cd /tmp/circus
$ bin/pip install circus
$ bin/pip install circus-web
$ bin/pip install chaussette

Once this is done, you'll find a plethora of commands in the local bin dir.

Usage

Chaussette comes with a default hello world app; try running it:

$ bin/chaussette

You should be able to visit http://localhost:8080 and see hello world.

Stop Chaussette and add a circus.ini file in the directory containing:

[circus]
stats_endpoint = tcp://127.0.0.1:5557
httpd = 1

[watcher:webapp]
cmd = bin/chaussette --fd $(circus.sockets.web)
numprocesses = 3
use_sockets = True

[socket:web]
host = 127.0.0.1
port = 9999

This config file tells Circus to bind a socket on port 9999 and run 3 chaussette workers against it. It also activates the Circus web dashboard and the statistics module.

Save it & run it using circusd:

$ bin/circusd --daemon circus.ini

Now visit http://127.0.0.1:9999, you should see the hello world app.

You can also visit http://localhost:8080/ and enjoy the Circus web dashboard.

Interaction

Let's use the circusctl shell while the system is running:

$ bin/circusctl
circusctl 0.7.1
circusd-stats: active
circushttpd: active
webapp: active
(circusctl)

You get into an interactive shell. Type help to get all commands:

(circusctl) help

Documented commands (type help <topic>):
========================================
add     get            list         numprocesses  quit     rm      start   stop
decr    globaloptions  listen       numwatchers   reload   set     stats
dstats  incr           listsockets  options       restart  signal  status

Undocumented commands:
======================
EOF  help

Let's try some basic things: list the web app's worker processes and add a new one:

(circusctl) list webapp
13712,13713,13714
(circusctl) incr webapp
4
(circusctl) list webapp
13712,13713,13714,13973

Congrats, you've interacted with your Circus! Get off the shell with Ctrl+D and now run circus-top:

$ bin/circus-top

This is a top-like command to watch all your processes' memory and CPU usage in real time.

Hit Ctrl+C, and now let's quit Circus completely via circusctl:

$ bin/circusctl quit
ok

Next steps

You can plug in your own WSGI application instead of Chaussette's hello world simply by pointing Chaussette at your application callable.
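For instance, a minimal WSGI callable could look like the following sketch (the myapp module and app names are illustrative, not from the original post):

# myapp.py -- a minimal WSGI application to serve instead of
# Chaussette's built-in hello world.
def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from my own app!\n']

With the config above, the watcher command would then become something like cmd = bin/chaussette myapp.app --fd $(circus.sockets.web), adjusting the module path to your own layout.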

Chaussette also comes with many backends like Gevent or Meinheld.

Read https://chaussette.readthedocs.org/ for all options.

[carlchenet] Mini-message roundup #2

Follow me also on Identi.ca or on Twitter. For this category of articles, I am switching to weekly publication. The original links are enriched with the further digging I was able to do between the publication of the dent/tweet and the publication of this article. #debian #wheezy 7.1 should be released on Saturday, June 15 http://ur1.ca/dvraj => Very interesting news that did not get much coverage. The [...]

[afpy.org] Solution Linux 2013 in Paris

The AFPy will be present at the Solution Linux trade show on May 28 and 29.

[afpyro] AFPyro in Pau - May 23, 2013


An AFPyro will take place on Thursday, May 23 starting at 7:30 pm

at the Henri IV chess club

39 ter rue E. Guichenné 64000 PAU

OSM map

The venerable cyp is coming to tell us about what he does at his company with Pyramid!

Enjoy

[afpyro] AFPyro in Lyon - February 27, 2013


An AFPyro will take place on Wednesday, February 27 starting at 8 pm at l'Antre Autre - 11 rue Terme - 69001 Lyon.

A presentation on Salt will be given by Gaston Tjebbes. Salt is a tool for managing servers, written in Python.

L'Antre Autre is a place where we can talk over a drink and, for those who wish, have a meal.

To get to l'Antre Autre:
  • by metro: Hôtel de Ville stop
  • by bus: lines C13 and C18, Mairie du 1er stop, or lines 19, C14 and C3, Terreaux stop
  • by vélo'v: Place Sathonay, Carmélites Burdeau and Place de la paix stations

[carlchenet] Mini-message roundup #3: Erlang, Python, Django, Debian GNU/Hurd, SSH

Follow me also on Identi.ca or on Twitter. Here are last week's interesting dents/tweets. On the menu: Erlang, Python, Django, Debian GNU/Hurd and SSH: experience report of a migration from #erlang to #python http://ur1.ca/e1qmm => in a high-availability context, you generally observe rather the opposite happening. The arguments raised by the author are nonetheless very [...]

[logilab] LMGC90 Sprint at Logilab in March 2013


LMGC90 Sprint at Logilab

At the end of March 2013, Logilab hosted a sprint on the LMGC90 simulation code in Paris.

LMGC90 is an open-source software developed at the LMGC ("Laboratoire de Mécanique et Génie Civil" -- "Mechanics and Civil Engineering Laboratory") of the CNRS, in Montpellier, France. LMGC90 is devoted to contact mechanics and is, thus, able to model large collections of deformable or undeformable physical objects of various shapes, with numerous interaction laws. LMGC90 also allows for multiphysics coupling.

Sprint Participants

https://www.logilab.org/file/143585/raw/logo_LMGC.jpg https://www.logilab.org/file/143749/raw/logo_SNCF.jpg https://www.logilab.org/file/143750/raw/logo_LaMSID.jpg https://www.logilab.org/file/143751/raw/logo_LOGILAB.jpg

More than ten hackers joined in from:

  • the LMGC, which leads LMGC90 development and aims at constantly improving its architecture and usability;
  • the Innovation and Research Department of the SNCF (the French state-owned railway company), which uses LMGC90 to study railway mechanics, and more specifically the ballast;
  • the LaMSID ("Laboratoire de Mécanique des Structures Industrielles Durables", "Laboratory for the Mechanics of Ageing Industrial Structures") laboratory of EDF / CNRS / CEA, which has strong expertise in Code_ASTER and LMGC90;
  • Logilab, as the developer, for the SNCF, of a CubicWeb-based platform dedicated to simulation data and knowledge management.

After a great introduction to LMGC90 by Frédéric Dubois and some preliminary discussions, teams were quickly constituted around the common areas of interest.

Enhancing LMGC90's Python API to build core objects

As of the sprint date, LMGC90 is mainly developed in Fortran, but also contains Python code for two purposes:

  • Exposing the Fortran functions and subroutines in the LMGC90 core to Python; this is achieved using Fortran 2003's ISO_C_BINDING module and Swig. These Python bindings are grouped in a module called ChiPy.
  • Making it easy to generate input data (so called "DATBOX" files) using Python. This is done through a module called Pre_LMGC.

The main drawback of this approach is the double modelling of data that this architecture implies: once in the core and once in Pre_LMGC.

It was decided to build a unique user-level Python layer on top of ChiPy, that would be able to build the computational problem description and write the DATBOX input files (currently achieved by using Pre_LMGC), as well as to drive the simulation and read the OUTBOX result files (currently by using direct ChiPy calls).

This task has been met with success, since, in the short time span available (half a day, basically), the team managed to build some object types using ChiPy calls and save them into a DATBOX.

Using the Python API to feed a computation data store

This topic involved importing LMGC90 DATBOX data into the numerical platform developed by Logilab for the SNCF.

This was achieved using ChiPy as a Python API to the Fortran core to get:

  • the bodies involved in the computation, along with their materials, behaviour laws (with their associated parameters), geometries (expressed in terms of zones);
  • the interactions between these bodies, along with their interaction laws (and associated parameters, e.g. friction coefficient) and body pair (each interaction is defined between two bodies);
  • the interaction groups, which contain interactions that have the same interaction law.

There is still a lot of work to be done (notably regarding the loads applied to the bodies), but this is already a great achievement. This could only have occurred in a sprint, where all the needed expertise was available:

  • the SNCF experts were there to clarify the import needs and check the overall direction;

  • Logilab implemented a data model based on CubicWeb, and imported the data using the ChiPy bindings developed on demand by the LMGC core developer team with their usual ISO_C_BINDING / Swig Fortran wrapping dance.

    https://www.logilab.org/file/143753/raw/logo_CubicWeb.jpg
  • Logilab undertook the data import; to this end, it asked the LMGC how the relevant information from LMGC90 can be exposed to Python via the ChiPy API.

Using HDF5 as a data storage backend for LMGC90

The main point of this topic was to replace the in-house DATBOX/OUTBOX textual format used by LMGC90 to store input and output data, with an open, standard and efficient format.

Several formats have been considered, like HDF5, MED and NetCDF4.

MED has been ruled out for the moment, because it lacks support for storing body contact information. HDF5 was finally chosen because of the quality of its Python libraries, h5py and pytables, and the ease of use that tools like h5fs provide.

https://www.logilab.org/file/143754/raw/logo_HDF.jpg

Alain Leufroy from Logilab quickly presented h5py and h5fs usage, and the team started its work, measuring the performance impact of the storage pattern of LMGC90 data. This was quickly achieved, as the LMGC experts made it easy to setup tests of various sizes, and as the Logilab developers managed to understand the concepts and implement the required code in a fast and agile way.
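To give an idea of what such a backend looks like, here is a tiny illustrative h5py sketch (not LMGC90 code; the file and dataset names are made up):

import numpy as np
import h5py

# Write simulation-style arrays into an HDF5 file...
with h5py.File('outbox.h5', 'w') as f:
    f.create_dataset('bodies/positions', data=np.random.rand(1000, 3))
    f['bodies/positions'].attrs['unit'] = 'm'

# ...and read them back later.
with h5py.File('outbox.h5', 'r') as f:
    positions = f['bodies/positions'][:]
    print(positions.shape)  # (1000, 3)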

Debian / Ubuntu Packaging of LMGC90

This topic turned out to be more difficult than initially assessed, mainly because LMGC90 depends on external libraries that are not yet packaged, and which thus had to be packaged first:

  • the Matlib linear algebra library, written in C,
  • the Lapack95 library, which is a Fortran95 interface to the Lapack library.

Logilab kept working on this after the sprint and produced packages that are currently being tested by the LMGC team. Some changes are expected (for instance, Python modules should be prefixed with a proper namespace) before the packages can be submitted for inclusion into Debian. The expertise of Logilab regarding Debian packaging was of great help for this task. This will hopefully help to spread the use of LMGC90.

https://www.logilab.org/file/143755/raw/logo_Debian.jpg

Distributed Version Control System for LMGC90

As you may know, Logilab is really fond of Mercurial as a DVCS. Our company invested a lot into the development of the great evolve extension, which makes Mercurial a very powerful tool to efficiently manage the team development of software in a clean fashion.

This is why Logilab presented Mercurial's features and advantages over the current VCS used to manage LMGC90 sources, namely svn, to the other participants of the sprint. This was appreciated and will hopefully benefit LMGC90's ease of development and its spread in the open source community.

https://www.logilab.org/file/143756/raw/logo_HG.jpg

Conclusions

All in all, this two-day sprint on LMGC90, involving participants from several industrial and academic institutions, has been a great success. A lot of code has been written but, more importantly, several stepping stones have been laid, such as:

  • the general LMGC90 data access architecture, with the Python layer on top of the LMGC90 core;
  • the data storage format, namely HDF5.

Somewhat collaterally, several other results have also been achieved:

  • partial LMGC90 data import into the SNCF CubicWeb-based numerical platform,
  • Debian / Ubuntu packaging of LMGC90 and dependencies.

On a final note, one would say that we greatly appreciated the cooperation between the participants, which we found pleasant and efficient. We look forward to finding more occasions to work together.

[afpy.org] Meetup Paris.PY @ Le Camping, Paris

Short talks & beers & pizzas, for everyone interested in Python.

[carlchenet] Mini-message roundup #4: Bootstrap, Django, Riemann, Shinken, Squeeze to Wheezy migration and continuous deployment

Follow me also on Identi.ca or on Twitter. As every week, here are last week's interesting dents/tweets that I published on Identi.ca or on Twitter, reviewed and expanded with any observations and comments that matured over the past week. On the menu: Bootstrap for Django (the excellent Python web framework), Riemann, Shinken, Debian Squeeze to Wheezy migration and continuous deployment. #bootstrap theme for #django [...]

[tarek] Raspberry-Pi Ghetto Blaster Suitcase


tl;dr: I built a ghetto blaster out of a suitcase. Click on the image below to see me dancing with it.

https://lh4.googleusercontent.com/-WsaYOOate7w/UaoqKI6guzI/AAAAAAAAD_o/iRZhFzaSlrw/w912-h604-no/balster+%25281+of+2%2529.jpg

After my Raspberry Pi jukebox project was done, I wanted to take it to the next level and build a standalone amplified speaker I could drive from the home wifi, instead of spending $300 on a Bose SoundLink or a Jawbone JAMBOX and getting fewer features than what I can build with a Raspberry Pi.

So I built a ghetto blaster out of a suitcase.

Ghetto blasters are one of the coolest things ever. It's the perfect device for enjoying music outside - and it's so 80s... :)

I found an old suitcase in my basement that used to contain tools. This kind of suitcase is made of cardboard covered with aluminum. Once emptied, it's perfect as a speaker cabinet: the cardboard and aluminum vibrate and produce excellent bass. This suitcase costs around 10 euros.

I also found 2 old car speakers in my basement that are pretty good: 25 W & 3 channels each. I suspect these would cost around 20 euros these days.

Once the holes were made and the speakers screwed onto the suitcase panel, I bought a small 25 W amplifier on Amazon for 27 euros. This thing is really amazing. It's small enough to fit in the suitcase and has a small equalizer that is really handy. I unscrewed its front panel, placed it on the outside of the suitcase, and screwed it back through the suitcase to hold the amplifier inside.

I started to play with my suitcase and was amazed by the sound: it really kicks and has very good bass.

The next step was to plug in a Raspberry Pi with a USB sound card and a wifi dongle and run Mopidy on it. That allowed me to stream music from my Spotify account.

When the Raspberry Pi starts, it launches Mopidy, connects to the home wifi and speaks out using espeak:

"I am ready to play music, my IP address is 192.168.0.16"

From there I can start a MPD client like MPDroid and connect to that IP and queue some music.
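A boot-time announcement along those lines could be scripted as in the sketch below (a guess at the approach, not the author's actual script; it assumes espeak is installed and the Pi is on the network):

import socket
import subprocess

def current_ip():
    # Open a UDP socket towards a public address just to learn which local
    # address would be used; nothing is actually sent.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(('8.8.8.8', 80))
        return s.getsockname()[0]
    finally:
        s.close()

subprocess.call(
    ['espeak', 'I am ready to play music, my IP address is %s' % current_ip()])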

Powering

Of course, the big challenge was to power the amplifier & the Raspberry Pi so I could actually walk around freely. I did not want to use a lead-acid battery, so I bought this 12 V LiPo battery for $20. It comes pre-charged and has a small on/off button.

Now, this battery delivers 12 V but I still need 5 V for the Raspberry Pi. You can use a voltage regulator for this, like the LM1117.

I built a small board you can see in the video. It takes the 12 V from the battery and outputs 5 V for the Raspberry Pi. It has the LM1117 with a heat sink, and a few capacitors for stability.

It's exactly the same design as this one: https://www.youtube.com/watch?v=CKS6zHo5T9k, except they use an L7805 in there, which has different wiring.

That's it - my 12 V LiPo powers the amplifier & the Raspberry Pi. It's been playing for hours and the battery still has some juice.

Issues & next steps

The wifi dongle loses the signal if I close the suitcase and I am too far from the wifi router. I need to set up an external antenna.

I am also going to add a battery level indicator, using this schematic

One issue I have yet to solve is reconfiguring the network setup in case I use the ghetto blaster in someone else's house. Right now I have to plug in a screen and a keyboard, or plug in a network cable and ssh into the Raspberry Pi, to change the network config.

Maybe one way to solve this would be to have a second wifi dongle set as an access point, and a small web interface to configure the network.

Raspberry Pis are so much fun.

[tarek] A Pythoneers directory


I've been helping organize PyCon France for years (not recently, though) and I've also been lightly involved in the FOSDEM Python room this year.

There's one thing that frustrated me a lot in this exercise: the inability to reach out to specific people in the Python community.

I was never really able to answer questions like this:

  • I want a list of all Numpy specialists in Europe that are willing to give a talk.
  • I want a list of all women developers that are willing to give a talk about Python in Europe.
  • I want a list of experts that can help me selecting the right talks for a given topic.
  • I want to know who loves Python in Burgundy!

Of course you can shout out on a blog, on twitter, or try to reach the right people through your network.

For the second question -- reaching out to women -- I learnt too late that I should have worked with the PyLadies on this, or even the Ada Initiative.

But this is too much work. I just want to reach the right people without having to go through a lot of intermediaries.

I want to have a single place on the internets where Bob and Sarah can add a list of topics they are involved in, say whether they are interested in giving talks, in what part of the world, etc.

I think it can't be Lanyrd because they would probably not let me query their user database with specific queries like the ones I mentioned. It can't be LinkedIn for the same reason.

I was pointed to some directories that looked a bit like what I envisioned, but they were local directories that just listed Python people without giving all the info I am looking for.

So what I think the Python community is missing is a worldwide Pythoneers directory that is not affiliated with any group, organization, company or foundation -- where users can fill out a profile with their interests in Python.

So I am going to try to build that tool during the next Europython in Italy.

The features I want are pretty simple:

  • the ability to connect with your github or bitbucket account.
  • the ability to build a page about you, a little bit like about.me, but with predefined, optional fields like your gender, your location, the projects you are involved in, and so on.
  • the ability for someone to query & reach out to a specific group of people.

Of course, this raises some issues, like the fact that it could be used by commercial companies to send unsolicited e-mails such as recruiting e-mails, or plain spam. But people are already used to dealing with this as soon as they have a public life.

If you are interested in this, reach out to me or add a comment. The goal will be to talk about the idea, refine it, have a first version running by the end of EuroPython, and see if we get somewhere.

[afpy.org] Come and learn a bit of Python on June 10, 2013 at La Cantine in Paris

The last Python programming training session for beginners this season. Registration is required on La Cantine's website.

[raspberry-python] Brython in Toulon

[carlchenet] The technologies behind a Django-based website today

Follow me also on Identi.ca or on Twitter. How do I reconcile my desire to build a website around an original (and therefore very motivating) idea with the associated technical side, which put me off at first given my experience of the PHP/MySQL world (admittedly a bit dated)? My interest in the Python language naturally pushed me towards the [...]

[cubicweb] OpenData meets the Semantic Web at WOD2013


A few of us from Logilab went to the 2nd International Workshop on Open Data (WOD) on the 3rd of June.

Although the main focus was an academic take on OpenData, a lot of talks were related to the Semantic Web technologies and especially LinkedData.

http://www.logilab.org/file/144837/raw/banniere-wod2013.png

The full program (and papers) is on the workshop website. Here is a quick review of the things we thought worth sharing.

  • privacy-oriented ontologies: http://l2tap.org/
  • interesting automations done to suggest alignments when initial data is uploaded to an open data website
  • some open data platforms have built-in APIs to get files; one example is Socrata: http://dev.socrata.com/
  • some work is being done to scale the processing of linked data in the cloud (did you know you could access readily available datasets in the Amazon cloud? DBpedia, for example)
  • the data stored in Wikipedia can be a good source of vocabulary for certain machine learning tasks (and, in the future, the Wikidata project)
  • there is an RDF extension to Google Refine (or OpenRefine), but we haven't managed to get it working out of the box
  • WebSmatch uses morphological operators (erosion / dilation) to identify grids and zones in Excel spreadsheets and then aligns column data on known reference values (e.g. country lists).

We naturally enjoyed the presentation made by Romain Wenz about http://data.bnf.fr with the unavoidable mention of Victor Hugo (and CubicWeb).

Thanks to the organizers of the conference and to the National French Library for hosting the event.

[carlchenet] Mini-message roundup #4: selfoss, Redis, Erlang, Projectlibre, Python, MySQL, Django

Follow me also on Identi.ca or on Twitter. As every week, here are last week's interesting dents/tweets that I published on Identi.ca or on Twitter, reviewed and expanded with any observations and comments that matured over the past week. On the menu: selfoss, Redis, Erlang, Projectlibre, Python, MySQL and Django. Self-hosted RSS reader in #PHP with #PostgreSQL or #MySQL or #sqlite: selfoss (GPLv3) http://selfoss.aditu.de/ => it is now in place for [...]

[cubicweb] We're going to PGDay France, the Postgresql Community conference


A few people from the CubicWeb team are going to attend the French PostgreSQL community conference in Nantes (France) on the 13th of June.

http://www.cubicweb.org/file/2932005/raw/hdr_left.png

We're excited to learn more about the following topics that are relevant to CubicWeb's development and features:

https://www.pgday.fr/_media/pgfr2.png

Obviously we'll pay attention to all the talks during the day. If you're attending, we hope to see you there.
