This is NOT a specific network, like Bitcoin, dedicated to financial transactions. It’s really a framework for people to run their own network(s) adapted to their specific application. The idea is that there isn’t going to be just one big blockchain network but rather many, many such networks dedicated to various applications and involving a variety of players. For that reason the goal of the Hyperledger Project is to develop a very flexible framework that can be configured to meet the specific requirements of each application.
I actually knew very little about blockchain when I started and I still have a lot to learn but it’s been a lot of fun. My primary role is to help the project be successful as an Open Source project and guide the IBM development team in its transition from working on an internal project to working on an Open Source project. As time permits I also contribute and help with the development itself.
It’d been years since I did any production programming and it’s been a good opportunity to get a much needed refresher, getting to actually use many of the various tools that one typically uses for development nowadays. The first two weeks were rather humbling. Pretty much every step of the way I was stumbling on a piece of technology I had not used before. But I’m happy to say that I learned and was eventually able to start contributing to the project.
Back at the beginning of the year I felt the time was right for me to start on a new project and blockchain seemed interesting. After 6 months on the Hyperledger Project I can only say I’m glad I made that choice!
What just happened
This specification was previously a Candidate Recommendation (CR) and this represents a step back – sort of.
Why it happened
The reason for going back to Last Call, which is before Candidate Recommendation on the W3C Recommendation track, is primarily because we are lacking implementations of the IndirectContainer.
Candidate Recommendation is a stage when implementers are asked to go ahead and implement the spec, now considered stable, and report on their implementations. To exit CR and move to Proposed Recommendation (PR), which is when the W3C membership is asked to endorse the spec as a W3C Recommendation/standard, every feature in the spec has to have two independent implementations.
Unfortunately, in this case, although most features of the spec have at least two independent implementations (see the implementation report), IndirectContainer does not. Until this happens, the spec is stuck in CR.
We do have one implementation and one member of the WG said he plans to implement it too. However, he couldn’t commit to a specific timeline.
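For readers wondering what makes this particular feature harder to implement: an IndirectContainer is the most complex of LDP’s three container types, because the membership triple it adds is derived from the content of the posted document rather than from the URI the server assigns. Here is a minimal sketch of what one looks like, as a Turtle snippet held in a Python string; the ldp: predicates come from the LDP vocabulary, but the resource names are invented for illustration:

```python
# A minimal LDP IndirectContainer description in Turtle, held as a Python
# string for illustration. The ldp: predicates are from the LDP vocabulary;
# the resource names (/annotations/, /article, /vocab#annotatedBy) are
# made up for this example.
indirect_container = """\
@prefix ldp: <http://www.w3.org/ns/ldp#> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

</annotations/> a ldp:IndirectContainer ;
    ldp:membershipResource </article> ;
    ldp:hasMemberRelation </vocab#annotatedBy> ;
    # The new member's URI is taken from inside the POSTed document
    # (whatever its foaf:primaryTopic points at), not from the URI the
    # server assigns -- this indirection is what servers must implement.
    ldp:insertedContentRelation foaf:primaryTopic .
"""

print(indirect_container)
```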
So, rather than taking the chance of having the spec stuck in CR for an indefinite amount of time, the WG decided to republish the spec as a Last Call draft, marking IndirectContainer as a “feature at risk”.
What it means
When the Last Call review period ends in 3 weeks (on 7 October 2014), either we will have a second implementation of IndirectContainer and the spec will move to PR as is (skipping CR because we will then have two implementations of everything), or we will move IndirectContainer to a separate spec that can stay in CR until there are two implementations of it, and move the rest of the LDP spec to PR (skipping CR because we already have two implementations).
I said earlier publishing the LDP spec as a Last Call was a “step back – sort of” because it’s really just a technicality. As explained above, this actually ensures that, either way, we will be able to move to PR (skipping CR) in 3 weeks.
Bonus: Augmented JSON-LD support
When we started 2 years ago, the only standard serialization format for RDF was RDF/XML. Many people disliked this format, which is arguably responsible for the initial lack of adoption of RDF, so the WG decided to require that all LDP servers support Turtle as a default serialization format – Turtle was in the process of becoming a standard. The WG got praised for this move which, at the time, seemed quite progressive.
Yet, a year and a half later, during which we saw the standardization of JSON-LD, requiring Turtle while leaving out JSON-LD no longer appeared so “bleeding edge”. At the LDP WG Face to Face meeting in Spring, I suggested we encourage support for JSON-LD by adding it as a “SHOULD”. The WG agreed. Some WG members would have liked to make it a MUST but this would have required going back to Last Call, and I, for one, as chair of the WG responsible for keeping the WG on track to deliver a standard on time, didn’t think this was reasonable.
Fast forward to September, we now found ourselves having to republish our spec as a Last Call draft anyway (because of the IndirectContainer situation). We seized the opportunity to increase support for JSON-LD by requiring LDP servers to support it (making it a MUST).
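To make the difference between the two formats concrete, here is the same single statement (a resource declaring itself an LDP BasicContainer) in both syntaxes; this is my own minimal illustration, not an example taken from the spec:

```python
import json

# One statement, two serializations: "/container/ is an ldp:BasicContainer".

# Turtle: compact, triple-oriented, but needs a dedicated parser.
turtle = """\
@prefix ldp: <http://www.w3.org/ns/ldp#> .
</container/> a ldp:BasicContainer .
"""

# The equivalent JSON-LD, built as a plain Python dict. Being regular JSON,
# it can be produced and consumed with any JSON library, which is a big
# part of its appeal to web developers.
json_ld = {
    "@id": "/container/",
    "@type": "http://www.w3.org/ns/ldp#BasicContainer",
}

print(json.dumps(json_ld, indent=2))
```

RDF libraries such as rdflib can convert between the two forms, but the point is that ordinary JSON tooling is enough to read and write the JSON-LD one.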
I wish we had marked IndirectContainer as a feature at risk when we moved to Candidate Recommendation back in June. Already then we knew we might not have enough implementations of it to move to PR. If we had marked it as a feature at risk we could now just go to PR without it and without any further delay.
This is something to remember: when in doubt, just mark things “at risk”. There is really not much downside to it and it’s a good safety valve to have.
It’s already confusing enough to me that “butter” in the US is actually salted and what should really be called “butter” is called “unsalted butter”.
It’s not like butter is naturally salted and you have to remove salt from it to make “unsalted butter”. Salt is added to butter to make it salted butter. So the current names defy logic.
But that’s not all. To make matters worse, Lucerne’s unsalted butter sticks come in a blue wrapper packaged in green cartons, while its salted butter (a.k.a. “butter”) sticks come in a red wrapper in blue cartons! Now, if that’s not madness, what is it??
Seriously, how hard is it to have matching colors??
Given the absurd but unrelenting resistance against abandoning the Imperial system in favor of the Metric system despite all the good reasons to do so, I don’t expect the US to adopt the right names for butter, but at least Lucerne could sort out its packaging and make it easier on us. 🙂
For once I’ll post on something that has nothing to do with my work. It’s also not exactly news anymore, but I feel like adding my piece on this.
When my wife and I were on vacation last year we were given a box of wine – and I don’t mean a box of bottles of wine, but literally a box of wine, as in “box wine”. 🙂
It’s fair to say that we were pretty skeptical at first but we ended up agreeing that it was actually quite nice. Based on that experience we decided to try a few box wines back in California. We tried the Bota Box first and stuck with it for a while but we eventually grew tired of it and switched to Black Box which seemed significantly better. This has become our table wine. We’ve compared the Cabernet against the Shiraz and the Merlot and the Shiraz won everyone’s vote, although I like to change every now and then.
Last weekend I had some friends over for lunch and ended up with a bottle that was barely started. On my own for the week it took me several days to get to the bottom of it. All along I kept the bottle on the kitchen counter with just the cork on.
At the rate of about a glass a day I noticed the quality was clearly declining from one day to the next. Tonight, as I finally reached the bottom of the bottle I drank the last of it without much enjoyment. This is when I decided to get a bit more from the Black Box wine that had now been sitting on the counter for about two weeks.
Well, box wine is known for staying good longer because the bag it’s stored in, within the box, deflates as you pour out wine without letting any air in. As a result the wine doesn’t deteriorate as fast.
If there was any doubt, tonight’s experience cleared it up for me. Although the wine from the bottle was initially of better quality than any box wine I’ve tried to date, after a week it was nowhere near as good.
I’ve actually found box wine to have several advantages. It’s said to be less environmentally unfriendly. (They say more environmentally friendly, but with its plastic bag and valve it doesn’t quite measure up against drinking water. 😉) Because it can last much longer than an open bottle of wine you can have different ones “open” at the same time and enjoy some variety.
So, all I can say is: More power to box wine! 🙂
[Additional Note: This is not to say that one should give up on bottled wine of course. The better wines only come in bottles and I’ll still get those for the “special” days. But for table wine you use on a day to day basis box wine rules.]
When OpenOffice went to the Apache Software Foundation I started writing a post about this topic that I never got to finish and publish.
The post from my colleague Rob Weir on Ending the Symphony Fork prompted me to post this now though.
I should say that I no longer have anything to do with what IBM does with ODF and anything related. I changed positions within IBM in the Fall of 2010 and now focus on other things, such as Linked Data, which I may talk about in some other post.
In fact, I’m now so out of touch with the folks involved with ODF that I only learned about OpenOffice going to the Apache Software Foundation when the news hit the wire. I had no knowledge of what was going on and have no insights as to what led to it. Similarly, I only learned after the fact about IBM deciding to merge Symphony with OpenOffice.
So, if anyone wants to dismiss this as the words of someone on the IBM payroll, I’m not even going to bother responding. This is entirely my personal opinion and I haven’t even talked about it to anyone within IBM.
But let me say quite bluntly what Rob is only hinting at: It’s time for LibreOffice to rejoin OpenOffice.
LibreOffice started from a sizable portion of the OpenOffice.org community being tired of Oracle’s control and apparent lack of interest in making it a more open community. I certainly understand that. But now that this problem is solved, what does anyone have to gain from keeping the fork alive?? Seriously.
While forks in the open source world can be a tremendous way of shaking things up, they can also be very damaging. In this case, I think it’s a waste of resources and energy to keep this going. Instead of competing with each other the LibreOffice and OpenOffice communities should get together to fight their common and real competitor.
I know a certain level of competition can be healthy but I’m tired of seeing open source communities fight with each other to their own loss.
I know the fork was painful and people still hold a lot of angst against one another but they need to get over that. They need to realize they would do themselves and everyone else a real service by putting all this behind them and uniting. LibreOffice should declare victory and join forces!
Presentation documents have become an essential part of our work life and our mailboxes are typically filled and even clogged with messages containing presentations. Some of these attachments are unnecessarily big and there is something very simple you can do to make them smaller. Here is what I found.
I was working on a presentation which had already gone through the hands of a couple of my colleagues when I noticed it contained 40 master slides, or templates, while only 3 of them were used and most of the others were just duplications.
From what I understand this typically happens when you paste in slides copied from other presentations. Usually what happens then is that the software pastes the slide in along with its template to ensure it remains unchanged. Given how common it is to develop presentations by pulling slides from different sources, this can quickly lead to a situation with many templates, many of them identical.
I went on to delete all the useless copies I had in my document and saved the file, only to discover that its size had gone from 1.6MB to a mere 800KB. Even though disk space and bandwidth are getting cheaper every day, I think anybody will agree that a 50% size reduction remains significant!
So, there you have it. Tip of the day: to make your presentation file smaller and avoid clogging your colleagues’ mailboxes unnecessarily, check the master view of your presentation and consolidate your templates.
Of course the actual size reduction you’ll get depends on your file. In this case the presentation contains about 20 slides, with only 3 slides including graphics.
For what it’s worth, I experimented both with ODF and PPT using Lotus Symphony, as well as with PPT using Microsoft PowerPoint 2003, and the results were similar in all cases.
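If you’d like to check a file for template bloat without opening the master view, note that an ODF presentation (.odp) is just a ZIP archive whose master pages are listed in its styles.xml part (this does not apply to the old binary .ppt format). Here is a small stdlib-only Python sketch; the function name is my own choosing:

```python
import zipfile
import xml.etree.ElementTree as ET

# ODF namespace for style elements, as defined by the OpenDocument spec.
STYLE_NS = "urn:oasis:names:tc:opendocument:xmlns:style:1.0"

def count_master_pages(odp_file):
    """Return the number of <style:master-page> elements in an ODF
    presentation's styles.xml (each one is a template/master slide)."""
    with zipfile.ZipFile(odp_file) as zf:
        root = ET.fromstring(zf.read("styles.xml"))
    return len(root.findall(f".//{{{STYLE_NS}}}master-page"))
```

Running it before and after consolidating your templates tells you exactly how many masters you got rid of; comparing the file sizes tells you what you saved.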
Just to clarify my post on the cost of wifi and wifi security: I’m not advising anyone to turn security off just because it may be significantly slowing their connection. For one thing, I didn’t.
Just like the old saying “safety first” goes, “security first” ought to prevail here. Indeed, there are several reasons for which you should bear the cost of security and keep it on no matter what.
If you need more speed, like I do for my media center, the solution for now is to use a cable and avoid wifi altogether. For a media center it’s not so bad given that I don’t really need to move it around, it’s just that there already are so many cables, one fewer would have been nice…
In the future, the upcoming 802.11n wifi standard should alleviate that problem by providing faster speed all around. You can actually already get products that support draft versions of the spec but I prefer waiting for the technology to stabilize.
The intent of my post was merely to highlight something that doesn’t seem to be getting much coverage on the web and which I think people should be aware of.
Also, I should note that both devices – the router and your computer – play a role in this. So, the loss in speed doesn’t necessarily only come from the router. The wifi card in your computer may be just as guilty. To figure this out you’d have to make tests with different computers, which I haven’t done (yet).