Microloans at Internet-scale
Posted on 13 April 2007 00:20 | Permalink
Not too long ago, my wife and I watched a documentary on BYU TV called "Small Fortunes." From the documentary's website:
Millions of the world's poorest--mostly women--who are unable to provide the necessary collateral to secure a traditional loan are turning to microcredit institutions for help. These institutions give "micro" loans, often for less than $100, to those for whom the entrepreneurial spirit is still in its purest, most basic form. Whether it's through milking a buffalo, selling tortillas, or weaving cloth, most borrowers are able to pay back their loans--and have enough profits to reinvest in their businesses, their homes, and their children.
I was inspired to see the amazing changes to people's lives that were brought about by these small loans. The effect is generational--as one generation pulls itself out of poverty by way of these loans, the next generation becomes able to acquire education, better work, and overall a much better way of life. Many of these people live hand-to-mouth, with 3 meals a day a rarity. After watching the program, I wanted to find some way to get involved in the microloan movement.
The other day while browsing ConnectBlogs I read a post by Richard Miller about kiva.org, which allows anybody to get involved in these kinds of loans. Kiva.org has 'profiles' of a large number of entrepreneurs seeking microloans in poorer parts of the world. You can browse through these profiles and choose to lend $25 or more to fund any of these microloans. 100% of what you lend goes to the entrepreneur. Once you've made a loan, you receive updates on the repayment progress and on the business you, along with others, have funded. Once the loan is repaid (and kiva.org reports 100% repayment thus far), you can choose to withdraw your funds or reinvest them in another loan.
Richard Miller calls this "long tail philanthropy", an apt description. I remember reading a blog entry about a guy who maintained some open-source software, and whose hard drive (containing some important source code) had crashed. He blogged about it, and in no time the people who used his software had contributed enough to get him back up and running, and to take his drive to a data-recovery shop to get his bits back. There is great power to be harnessed in the masses of the Internet--power that can do so much more good than meeting up via the latest social networking site. A tiny amount multiplied by millions goes a long way. Long tail philanthropy is one way in which those with more can each give a little to make a large difference in the lives of those with less.
Just browsing kiva.org makes me feel good--seeing the faces of those who are being helped--reading their stories--seeing the faces of the hundreds of nice folks who are helping out by making loans. This is Good Stuff(tm)! It allows people to retain their dignity, to "make their bread by the sweat of their brow," so to speak. Instead of perpetuating dependence and poverty by handouts, this kind of system fosters self-reliance and industry, smart thinking and hard work. Both those who give and those who receive benefit and are better off.
I think for me, the most powerful aspect of all this is that I can, with a few clicks, give a little of what I have, and make a significant difference in the life of someone in need halfway around the world. I wonder what other ways there are in which we can harness this kind of Internet-scale power in helping to feed the hungry, clothe the naked, and help the sick and afflicted?
Reader comments: 3
Podcasting content created by someone else
Posted on 05 April 2007 14:15 | Permalink
What do you do when there is great, freely-downloadable media content out there just begging to be podcasted, but the entity creating the content doesn't create any podcasts? Well, you do like Dave Smith and podcast it yourself.
Dave has created a podcast feed, in which the enclosure links point to media files served up from the LDS Church's website for the recent General Conferences.
This makes me wonder what other kinds of unique podcasts could be created by "mixing" links to various media files. There are a number of free podcast feed generation tools out there into which you could paste links to your favorite media. If memory serves, a while back on the LDSOSS mailing list it was suggested that the Church generate podcasts of various content such as General Conference. The idea of timed release of interesting content is powerful. By doing as Dave Smith has done, you could easily generate a podcast of, for example, a daily dose of the hymns, a daily scripture, or a weekly reading of the lesson in Teachings of the Presidents of the Church.
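To make the idea concrete, here is a minimal sketch of how such a feed could be generated: an RSS 2.0 document whose enclosure tags simply point at media files hosted elsewhere. The feed title and URLs below are placeholders of my own, not real Church-hosted addresses.

```python
# Sketch of a minimal RSS feed whose <enclosure> tags point at media
# files hosted on someone else's server.  The URLs are hypothetical.
import xml.etree.ElementTree as ET

def build_feed(title, episodes):
    """episodes: list of (episode_title, media_url, size_bytes) tuples."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    for ep_title, url, size in episodes:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = ep_title
        # The enclosure is just a link; the media stays on the original host.
        ET.SubElement(item, "enclosure",
                      url=url, length=str(size), type="audio/mpeg")
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("Daily Hymn (example)",
                  [("Hymn 1", "https://example.org/hymns/001.mp3", 1234567)])
print(feed)
```

A podcatcher pointed at the resulting file would fetch each episode straight from the original host, which is exactly the property that raises the copyright questions below.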
From a tech/geek/coolness perspective, this is neat stuff, but what about the copyright issues you might run into here? Is this akin to using img links to show other people's copyrighted photos on my own site (which I wouldn't feel comfortable with)? Would this count as 'distribution', even though the files are being served from the Church's website? Or is it just adding a set of links on my site to the various mp3 files hosted on the Church's server (with which I am comfortable)? Perhaps the key is to be explicit in the feed about where the content comes from and who owns it.
Reader comments: 2
Silver Lining thought: Parking at work
Posted on 28 March 2007 16:34 | Permalink
A silver-lining thought:
Recently at work, finding a parking space has put me farther out from the building than usual. Inconvenient, yes (boo-hoo for me), but not really a bad thing. I remember the desperate days of the Internet bust, when the parking lots around the building were very sparse. I could almost always park in the row right next to the building. Slowly the parking spots are filling up, and finding a space right next to the building is becoming rare. This gives me hope that the local tech economy is booming. I also get a few more steps away from my generally sedentary work-lifestyle. :-)
There is usually a bright side to many of life's little inconveniences.
Reader comments: 2
Emerging technologies for system administrators
Posted on 22 March 2007 18:47 | Permalink
If you're a system administrator, you're probably aware of the challenge of maintaining the configuration on a number of machines. You're also probably aware of the challenge of restoring a machine to a known configuration after a crash or hardware failure. And if you manage hundreds or even thousands of machines, then you're probably keenly aware of the challenge of keeping every one of those machines in sync, configuration-wise.
If you manage a large number of machines, you may use a tool like Cfengine. At $work we use it, and it has served our needs fairly well, though not without its problems at times. There are other such configuration management tools available. One that I have been following with some interest is Puppet, which aims to be a "next-generation" Cfengine. Having looked at the two, I have to say I like Puppet's configuration syntax a little better, and overall it feels "cleaner."
Today I came across a couple of intriguing tools that can be stacked on top of Puppet. One is Cft, the Configuration file tracker, which, in the spirit of the Unix script command, lets you begin a cft 'session', issue a number of system administration commands, end the session, and then spit out the Puppet configuration files (known as manifests) needed to duplicate whatever changes you made during the session. So you install the OS on a machine, run Cft, configure to taste, take the resulting manifests, and easily replicate that configuration on a thousand more machines.
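For readers who haven't seen Puppet, a manifest looks something like the following. This is a hand-written sketch (not actual Cft output), describing an NTP setup declaratively: the package, its config file, and the service that should restart when the file changes.

```puppet
# Hypothetical sketch of a Puppet manifest; the resource names and
# file paths are illustrative, not taken from any real Cft session.
class ntp {
    package { "ntp": ensure => installed }

    file { "/etc/ntp.conf":
        source  => "puppet:///modules/ntp/ntp.conf",
        require => Package["ntp"],
    }

    service { "ntpd":
        ensure    => running,
        enable    => true,
        subscribe => File["/etc/ntp.conf"],
    }
}
```

The appeal is that this says *what* the machine should look like, not the sequence of commands to get there, which is what makes replicating it across a thousand machines tractable.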
Another tool borrows from the idea behind package managers like Red Hat's rpm/yum and Debian's apt. PRM, the Puppet Recipe Manager, "facilitates the distribution of 'recipes', canned system configurations. A recipe contains the answer to a specific configuration question like 'How do I setup a forward-only postfix server' in a form that can be easily fed into Puppet." This takes the idea of yum and apt-get and extends it beyond packages to entire system configurations.
I can imagine over time, as the public collection of configuration recipes grows, that it will become possible to extend this one level further, to configuring entire clusters of machines. yum install web-farm. yum install 3-tier e-commerce architecture. My job keeps getting easier and easier :-).
Update: One more cool tool out of et.redhat.com, Virt-factory. The website describes it better than I can here. I'm very interested because at $work a few years ago, I developed a database-backed web-app to maintain information about our machines, which could spit out kickstart files for any of our machines on demand. This has made it easy to provision and re-provision new systems as needed. Virt-factory looks to take that kind of idea into the virtualization realm. This is a declarative model applied to system administration. Essentially, I define the 'what', and these tools worry about the 'how'.
One other nice thing about Virt-factory is that it will provide an XMLRPC API as well as a Web GUI for interacting with it, something I always wanted to do with my system, but never got around to doing. One more lego for my playground.
Reader comments: 0
Amazon S3 storage engine for MySQL, part II
Posted on 13 March 2007 18:38 | Permalink
In true LazyWeb fashion:
About a year ago, I blogged about the possibility of a MySQL storage engine which would utilize the Amazon Simple Storage Service (S3). Well folks, it looks like this year at the annual MySQL Conference, Mark Atwood will be presenting on A Storage Engine for Amazon S3.
Reader comments: 3
Total lunar eclipse on Sat, Mar 3, 2007
Posted on 02 March 2007 10:50 | Permalink
A total lunar eclipse will occur this Saturday, visible to much of the world. See the chart over at Shadow and Substance to get a good idea of when it will be visible in your neck of the woods. Here in Utah, totality will occur from around 3:45 pm to 5pm MST. Let's hope for clear skies.
More information can be found at this NASA page, as well as at spaceweather.com
Reader comments: 0
Are you ready for the 2007 Daylight Saving Time changes?
Posted on 26 February 2007 17:05 | Permalink
As part of the Energy Policy Act of 2005, Daylight Saving Time begins earlier and ends later than usual this year (Mar 11th and Nov 4th). If you're a systems, network, or database administrator, it's worth your while to take a close look at your systems and patch them if necessary.
This change may affect just about every system you manage: servers, databases, storage, routers, switches, PBXes, cell phones, and so on.
On most RedHat/Fedora Linux systems you can tell if your system is patched by doing the following:
/usr/sbin/zdump -v /etc/localtime | grep 2007
If you see references to March 11th and Nov 4th in the output, you're good to go. If not, you need to patch. Some Googling around will help you find what measures are necessary for other operating systems and software. For Fedora Core since at least FC2, you simply need to yum update your tzdata package, and then copy or link the right timezone file from /usr/share/zoneinfo to /etc/localtime. For earlier Red Hat releases, the timezone data was part of the glibc-common package, so updating is a bit trickier.
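If you'd rather check programmatically than eyeball zdump output, the same question can be asked of the timezone database from modern Python (3.9+). This is a sketch assuming the zone America/Denver; substitute your own zone as needed.

```python
# Quick check that the local timezone database knows about the 2007
# DST rule change: under the Energy Policy Act, US DST began on
# 2007-03-11 instead of the first Sunday in April.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

tz = ZoneInfo("America/Denver")

# The day before the new changeover date: still standard time.
before = datetime(2007, 3, 10, 12, 0, tzinfo=tz)
# The new changeover date itself: DST should be in effect by noon.
after = datetime(2007, 3, 11, 12, 0, tzinfo=tz)

print(before.dst())   # timedelta(0) if the database is patched
print(after.dst())    # timedelta(hours=1) if patched
```

If the second value comes back as zero, your timezone data still has the old first-Sunday-in-April rule and needs updating.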
The Wikipedia article referenced above raises a good question: will the theoretical productivity gains that come as a result of this act outweigh all the effort and time needed to patch and upgrade systems?
Reader comments: 0
Book Review: Database In Depth
Posted on 21 February 2007 23:58 | Permalink
The bottom line here is if you do much of anything with databases, then just about anything you read by C. J. Date will be worth your while. Database In Depth is no exception.
Database In Depth
Author: C. J. Date
Summary: An excellent introduction to the relational model by one of the best thinkers in the field.
Review Date: 21 Feb, 2007
When I was early in my Computer Science degree I took a course in which we discussed database fundamentals. In that class we learned about things like tuples, relations, predicates, predicate logic and deductive proofs. All of these were involved with the relational model, but it wasn't until later when I discovered the writings of Fabian Pascal and C. J. Date that I began to really understand how the above concepts tied into the database systems I was using, such as Oracle, MySQL, and Postgresql. One of those aha moments came when I realized that the deductive proofs we had done in that class were essentially queries to a database system. I came to see how each row (tuple) in a database table (relation) represented a set of values for a predicate that the relation represented. Overall, a database, then, was the logical AND of all the facts represented by the tuples of each relation. Queries were simply deductive proofs which allowed one to derive new facts from existing facts in the database. Good stuff all around.
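That "aha" is easy to make concrete in a few lines of code. Here's a toy sketch of my own (the names and facts are made up, not from the book): a relation is a set of tuples, each tuple a true instance of the relation's predicate, and a query is a deduction that derives new facts from stored ones.

```python
# A relation as a set of tuples: each tuple asserts that the
# predicate "x is a parent of y" is true for those values.
parent = {("Alma", "Helaman"), ("Helaman", "Nephi")}

# A query is a deductive step: derive "x is a grandparent of z"
# by joining parent with itself --
# exists y such that parent(x, y) AND parent(y, z).
grandparent = {(x, z)
               for (x, y1) in parent
               for (y2, z) in parent
               if y1 == y2}

print(grandparent)   # {('Alma', 'Nephi')}
```

The derived set is itself a relation, which is the closure property that makes the relational algebra compose so cleanly.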
If you're lucky, you will have studied Date's venerable An Introduction to Database Systems while taking a college course in databases. If not, then you're still lucky, because Date has condensed the fundamentals of the relational model into a very approachable and very practical book published by O'Reilly: Database In Depth.
Database In Depth takes you on a tour of the key concepts of the relational model, starting with the very basics (types, tuples, relations, and so forth) and proceeding step by step into more formidable territory (normalization, join dependencies, integrity constraints, relational algebra, and the like). Throughout the book, Date explains each concept with his characteristic clarity. Date knows this stuff through and through, and it shows.
You may be tempted to think like many others that theory and fundamentals are fine and dandy, but how practical are they in the real world? In my experience, they're crucial. By understanding the fundamentals and the theory behind the databases you work with, you can avoid costly design flaws that lead to poor data integrity. By understanding these concepts, you can design databases that you can trust absolutely to store and deliver accurate results. I've had to work with databases that weren't designed with these concepts in mind, and the difference is stark.
One warning: you won't be spoon-fed here. The material can be challenging, and Date expects you to use your brain. This isn't SQL For Dummies. The real advantage you gain by reading a book like this is understanding the mathematical and logical reasoning behind practical design principles: why, for example, it's important to normalize (and the pitfalls you can run into when you de-normalize), why nulls can lead to bad logic, and why duplicate rows are a bad idea all around. You'll also come to understand the ways in which most of today's database systems fail to faithfully implement the relational model, the consequences of those failures, and how to design your databases well despite those shortcomings.
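The null pitfall in particular is easy to demonstrate for yourself. Here's a small illustration of my own (not an example from the book) using SQLite, which ships with Python: under SQL's three-valued logic, a comparison involving NULL is neither true nor false, so rows silently drop out of results.

```python
# Demonstrating SQL's three-valued logic around NULL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER)")
conn.execute("INSERT INTO t VALUES (1), (NULL)")

# Intuition says this should match the NULL row; 3-valued logic
# says the comparison evaluates to UNKNOWN, not TRUE.
rows = conn.execute("SELECT a FROM t WHERE a = NULL").fetchall()
print(rows)    # [] -- no rows match, not even the NULL one

# Even "a = a" fails to match the NULL row:
rows2 = conn.execute("SELECT a FROM t WHERE a = a").fetchall()
print(rows2)   # [(1,)]
```

Once you've seen WHERE a = a fail to return a row, Date's arguments against nulls stop sounding academic.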
Unlike many computer books that become obsolete within a year or two of their publication, Database in Depth is among that narrow collection of computer books that remain useful and relevant for years. This is precisely because it remains grounded in theory and fundamentals, instead of being tied to specific brands and versions of software.
Overall Rating: 9/10
Reader comments: 0
Like Google Maps for the stars...
Posted on 21 February 2007 21:50 | Permalink
This is too cool:
There's an API for it as well.
Reader comments: 0
Upcoming family history conferences
Posted on 09 February 2007 12:37 | Permalink
In my previous post I mentioned the Family History Technology Workshop. This is a yearly 1-day conference that was originally designed to be a forum where BYU graduate students doing research in family-history related emerging technologies could present on their research. Each year there are usually a handful of students who present, and often a number of individuals from companies like MyFamily.com and folks from the Church Family History Department who provide thought-provoking presentations on the amazing things happening in this area.
Each year the conference begins with a very interesting keynote speaker (in 2005 it was Ransom Love, head of the Church Family History Department, and last year it was Peter Norvig from Google). Each year also includes a panel session with several influential members of the community. There are usually a number of vendors and individuals providing demos of their products and ideas. More info on the scope of the conference can be found at the FHT website. For me, the FHT is my 'Mecca event'; I look forward to it all year long.
One nice thing about the FHT is that you get to hear all the presentations of the workshop. They're not stacked in parallel tracks like other conferences, where you have to pick and choose which sessions you'll attend while always wondering what you're missing in the other rooms.
Perhaps the best aspect of the conference is the "hallway track", the opportunity to rub shoulders and share ideas with other folks who are interested in family history technologies. It's a great opportunity to put a "bug in the ear" of the folks who are building the tools we use for family history work.
At only $60, the conference is an incredible bargain. For the price you get a full day of excellent presentations, plus a delicious catered meal (lunch or dinner). Snacks are provided during morning and afternoon breaks. Free WiFi (which works great compared to other conferences I've been to) is provided for all conference attendees as well.
FHT is usually held on a Thursday, and if one day of family history geekliness weren't enough, the following Friday and Saturday are occupied by the BYU Computerized Genealogy Conference. Where the FHT workshop is targeted more towards "tool makers", the Computerized Genealogy Conference is targeted towards "tool users". As such, the geekiness is toned down a bit, but it's still a great conference if you're interested in family history and technology.
The Computerized Genealogy Conference is also a great bargain. For $120 you get the full 2-day conference, with loads of presenters and a large collection of vendors presenting their offerings. The cost includes a full syllabus of all the presentations of the conference, so if you miss a session, you at least have the notes from what you missed.
Last year's conference had a number of presenters from the Church's Family History Department, who spoke at length about the new Family Search, the digitization of the images in the Granite Mountain Vault, and the new indexing program, so it's a great place to learn more about what's coming down the pipe from the Church.
This year's FHT will be held on March 15th, and the Computerized Genealogy Conference will be on March 16th and 17th. Hope to see you there!
Reader comments: 0