Saving a failing software project is like working in a field hospital: you may know how to do it the right way, but you do not have time. There are two choices:
- doing it quick and dirty;
- starting over, cutting down, and still doing it right.
Quick and dirty is like giving up on antiseptics. The patient will die anyway, flesh rotting from infection. But you “did everything you could”.
Starting over is like giving up on anesthesia. It’s hard and unpleasant. But the software project has a chance to get back on track.
It’s not a choice of how to save the project. It’s a choice whether to save the project or to cover your neck (neck is a euphemism).
Writing unit tests is like washing your hands after going to the toilet. If you don’t, you both get bugs and worms yourself and put others in danger. The only remedy, for everyone else, is to stay away from you, as far as possible.
http://anglican.ml/, the proper domain for the Anglican way of machine learning. Also http://probprog.ml/.
There are only two software development paradigms: test-driven development and bug-driven development.
- Test-driven development results in programs which work well.
- Bug-driven development results in programmers which work hard.
Kant said: there are two a priori intuitions — space and time. There are also categories, and “the number of the categories in each class is always the same, namely, three”, like unity-plurality-totality, or possibility-existence-necessity. It would be fun to have three a priori intuitions, but only two exist, sigh. Really though?
Kant probably did not realize that there is a third one: probability, to wit, the certainty of our experience. Just like space, probability precedes any experience. Every object is as uncertain as it is extended.
The three a priori intuitions are related: space is infinite and undirected, time is infinite and directed, probability is finite and undirected. Physics knows of the uncertainty principle: we are uncertain about the relation of time and space, since neither can be intuited with certainty. Probability is as basic and fundamental to our cognition as time and space.
Just as geometry deals with the a priori intuition of space, and mathematical analysis with the intuition of time, the theory of probability deals with the intuition of probability. This is a philosophical justification for studying uncertainty, probability, and Bayesian inference.
Imagine that you have a great idea. You write it down on a napkin and show it to your colleagues; they photograph the napkin with their smartphones and get back to you with investment proposals.
Now, what if instead of a napkin one of your colleagues has a laptop or a tablet handy? (more…)
- a client on an old tablet or laptop in your kitchen (sitting on the fridge and also holding a recipe book),
- and a server serving a web page with a shopping checklist, automatically updated, to a mobile app.
Every time you run out of something (eggs, sugar, tea, …), you add it to the list of ‘missing’ goods (lookup/predictive input makes adding easier). When you go shopping, whatever you added is on the shopping list; when you buy an item, you cross out its entry.
A background knowledge module knows how to measure different things (sugar in kilograms or packets, eggs by count, etc.) and suggests default amounts to buy. If you have to buy something too often, the amount is automatically increased.
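The list-plus-background-knowledge idea can be sketched in a few lines of Python. Everything here is an illustrative assumption, not a spec: the default amounts, the units, and the one-week “too often” threshold are all made up for the sketch.

```python
from datetime import date, timedelta

# Hypothetical defaults for the background-knowledge module:
# good -> (default amount to buy, unit). Values are assumptions.
DEFAULTS = {
    "sugar": (1.0, "kg"),
    "eggs": (10, "pcs"),
    "tea": (1, "packet"),
}

class ShoppingList:
    def __init__(self):
        self.missing = {}        # good -> (amount, unit) to buy
        self.last_bought = {}    # good -> date of last purchase

    def ran_out(self, good, today=None):
        today = today or date.today()
        amount, unit = DEFAULTS.get(good, (1, "pcs"))
        # If the good ran out within a week of buying it, the default
        # amount is doubled for next time (the adaptive-increase rule).
        last = self.last_bought.get(good)
        if last is not None and (today - last) < timedelta(days=7):
            amount *= 2
            DEFAULTS[good] = (amount, unit)
        self.missing[good] = (amount, unit)

    def bought(self, good, today=None):
        # Crossing out the entry after a purchase.
        self.last_bought[good] = today or date.today()
        self.missing.pop(good, None)
```

A real client/server split would put `missing` behind the web page and the mobile app; the sketch only shows the bookkeeping.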
Imagine: a web app that sits on a collection of ebooks, shows the user a paragraph from a book, and asks whether they want to
- get (buy) the whole book to read;
- read another paragraph from this book;
- read a paragraph from a similar book;
- read a paragraph from a different book.
The app can remember the user’s past history to adjust its suggestions. How paragraphs are picked, and how similar and different books are chosen, is an interesting question.
For testing and development, free text repositories are available: Project Gutenberg, for example, among many others.
Paper, slides, and poster as presented at SOCS 2015.
We introduce an approximate search algorithm for fast maximum a posteriori probability estimation in probabilistic programs, which we call Bayesian ascent Monte Carlo (BaMC). (more…)
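To fix what “MAP estimation in a probabilistic program” means, here is the naive baseline, not BaMC itself: draw candidates from the prior and keep the one with the highest unnormalized posterior density. The Gaussian model is my own toy example.

```python
import math
import random

def log_prior(x):
    # Standard normal prior on the latent x.
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def log_likelihood(x, data):
    # Unit-variance Gaussian observations centered at x.
    return sum(-0.5 * (d - x) ** 2 - 0.5 * math.log(2 * math.pi)
               for d in data)

def map_by_search(data, n_samples=10000, rng=random):
    """Naive MAP search: best of n_samples prior draws."""
    best_x, best_lp = None, -math.inf
    for _ in range(n_samples):
        x = rng.gauss(0.0, 1.0)  # sample from the prior
        lp = log_prior(x) + log_likelihood(x, data)
        if lp > best_lp:
            best_x, best_lp = x, lp
    return best_x
```

For this conjugate model the exact MAP is `sum(data) / (len(data) + 1)`, which makes the baseline easy to sanity-check; the point of an algorithm like BaMC is to search far more efficiently than blind prior sampling.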
An early workshop paper (superseded by current research but still relevant), slides, and a poster.
We introduce a new approach to solving path-finding problems under uncertainty by representing them as probabilistic models and applying domain-independent inference algorithms to the models. (more…)
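As a toy illustration of the representation (my own, not the paper’s formulation): uncertain edge costs become random variables, and candidate paths are compared by Monte Carlo estimates of expected total cost. The graph and cost distributions below are assumptions.

```python
import random

# Each edge's cost is a sampler over a plausible range (assumed values).
EDGE_COST = {
    ("A", "B"): lambda rng: rng.uniform(1, 3),
    ("B", "D"): lambda rng: rng.uniform(1, 3),
    ("A", "C"): lambda rng: rng.uniform(0, 6),
    ("C", "D"): lambda rng: rng.uniform(0, 6),
}

def expected_cost(path, n=5000, rng=random):
    """Monte Carlo estimate of a path's expected total cost."""
    total = 0.0
    for _ in range(n):
        total += sum(EDGE_COST[e](rng) for e in zip(path, path[1:]))
    return total / n

def best_path(paths, rng=random):
    """Pick the candidate path with the lowest estimated expected cost."""
    return min(paths, key=lambda p: expected_cost(p, rng=rng))
```

Here A-B-D (expected cost about 4) beats A-C-D (about 6) even though A-C-D can be cheaper on a lucky draw, which is the kind of trade-off inference over the model resolves.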