Arnaud’s Open blog

Opinions on open source and standards

Box Wine Rules!

For once I’ll post on something that has nothing to do with my work. It’s also not exactly news anymore, but I feel like adding my piece on it.

When my wife and I were on vacation last year we were given a box of wine – and I don’t mean a box of bottles of wine, but literally a box of wine, as in “box wine”. :-)

It’s fair to say that we were pretty skeptical at first but we ended up agreeing that it was actually quite nice. Based on that experience we decided to try a few box wines back in California. We tried the Bota Box first and stuck with it for a while but we eventually grew tired of it and switched to Black Box which seemed significantly better. This has become our table wine. We’ve compared the Cabernet against the Shiraz and the Merlot and the Shiraz won everyone’s vote, although I like to change every now and then.

Last weekend I had some friends over for lunch and ended up with a bottle that was barely started. On my own for the week it took me several days to get to the bottom of it. All along I kept the bottle on the kitchen counter with just the cork on.

At the rate of about a glass a day, I noticed the quality clearly declining from one day to the next. Tonight, as I finally reached the bottom of the bottle, I drank the last of it without much enjoyment. That is when I decided to pour myself a bit more from the Black Box that had by then been sitting on the counter for about two weeks.

Well, box wine is known for staying good longer: the bag it’s stored in, within the box, deflates as you pour out wine without letting any air in. As a result the wine doesn’t deteriorate as fast.

If there was any doubt, tonight’s experience cleared it up for me. Although the wine from the bottle was initially better than any box wine I’ve tried to date, after a week it was nowhere near as good.

I’ve actually found box wine to have several advantages. It’s said to be less environmentally unfriendly. (They say more environmentally friendly, but with its plastic bag and valve it doesn’t quite measure up against drinking water. ;-)) And because it can last much longer than an open bottle of wine, you can have different ones “open” at the same time and enjoy some variety.

So, all I can say is: More power to box wine! :-)

[Additional Note: This is not to say that one should give up on bottled wine, of course. The better wines only come in bottles and I’ll still get those for the “special” days. But for the table wine you drink on a day-to-day basis, box wine rules.]

October 3, 2013 Posted by | standards | Leave a comment

Linked Data Platform Update

Since its launch in June of last year, the W3C Linked Data Platform (LDP) WG has made a lot of progress.

It took a while for WG members to come to understand each other, and sometimes it still feels like we don’t! But that’s what it takes to make a standard: you need people with very different backgrounds and expectations to come around and somehow find a happy medium that works for everyone.

One of the most difficult issues we had to deal with concerned containers, their relationship to member resources, and what to expect from the server when a container gets deleted. After investigating various possible paths, the WG finally settled on a simple design that is probably not going to make everyone happy but that all WG members can live with. One reason for this is that it can possibly be grown into something more complex later on if we really want to. In some ways we came full circle on that issue, but in the process we all gained a much greater understanding of what’s in the spec and why it is there, so this was by no means a useless exercise.

Per our charter, we’re to produce the Last Call specification this month. Last Call is when the WG thinks it’s done – all issues are closed – and external parties are invited to comment on the spec (not to say that comments aren’t welcome all the time). I’m sorry to say this isn’t going to happen this month. We simply have too many issues still open, and the draft has yet to incorporate some of the decisions that were made; that work will need to be reviewed and may generate more issues. However, the WG is planning to meet face to face in June to tackle the remaining issues. If everything goes to plan, this should allow us to produce our Last Call document by the end of June.

Anyone familiar with the standards development arena knows that one month behind is basically “on time”. :-)

In the meantime, next week I will be at the WWW2013 conference where I will be presenting on LDP. It’s a good opportunity to come and learn about what’s in the spec if you don’t know it yet! If you can’t make it to Rio, you’ll have another chance at the SemTech conference in June, where I will also be presenting on LDP. Jennifer Zaino from SemanticWeb.com wrote a nice piece based on an interview I gave her.

May 6, 2013 Posted by | linkeddata, standards | Leave a comment

More on Linked Data and IBM

For those technically inclined, you can learn more about IBM’s interest in Linked Data as an application integration model, and the kind of standard we’d like the W3C Linked Data Platform WG to develop, by reading a paper I presented earlier this year at the WWW2012 Linked Data workshop, titled “Using read/write Linked Data for Application Integration — Towards a Linked Data Basic Profile”.

Here is the abstract:

Linked Data, as defined by Tim Berners-Lee’s 4 rules [1], has enjoyed considerable well-publicized success as a technology for publishing data in the World Wide Web [2]. The Rational group in IBM has for several years been employing a read/write usage of Linked Data as an architectural style for integrating a suite of applications, and we have shipped commercial products using this technology. We have found that this read/write usage of Linked Data has helped us solve several perennial problems that we had been unable to successfully solve with other application integration architectural styles that we have explored in the past. The applications we have integrated in IBM are primarily in the domains of Application Lifecycle Management (ALM) and Integration System Management (ISM), but we believe that our experiences using read/write Linked Data to solve application integration problems could be broadly relevant and applicable within the IT industry.

This paper explains why Linked Data, which builds on the existing World Wide Web infrastructure, presents some unique characteristics, such as being distributed and scalable, that may allow the industry to succeed where other application integration approaches have failed. It discusses lessons we have learned along the way and some of the challenges we have been facing in using Linked Data to integrate enterprise applications.

Finally, we discuss several areas that could benefit from additional standard work and discuss several commonly applicable usage patterns along with proposals on how to address them using the existing W3C standards in the form of a Linked Data Basic Profile. This includes techniques applicable to clients and servers that read and write linked data, a type of container that allows new resources to be created using HTTP POST and existing resources to be found using HTTP GET (analogous to things like Atom Publishing Protocol (APP) [3]).

The full article can be found as a PDF file: Using read/write Linked Data for Application Integration — Towards a Linked Data Basic Profile
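To make the container pattern described in the abstract a bit more concrete, here is a minimal sketch of what a client interaction with such a container could look like, using Python’s third-party requests library. The container URL and the Turtle content are hypothetical; a real LDP-style server would of course expose its own resources.

```python
import requests

# Hypothetical URL of an LDP-style container; a real server would have its own.
CONTAINER = "http://example.org/app/assetContainer/"

# Create a new member resource by POSTing its RDF (in Turtle) to the container.
new_asset = """
@prefix dcterms: <http://purl.org/dc/terms/> .
<> dcterms:title "A new asset" .
"""
resp = requests.post(
    CONTAINER,
    data=new_asset.encode("utf-8"),
    headers={"Content-Type": "text/turtle"},
)
resp.raise_for_status()

# The server responds with the URL of the newly created resource.
asset_url = resp.headers["Location"]

# Existing resources, and the container's membership, are found with plain GETs.
asset = requests.get(asset_url, headers={"Accept": "text/turtle"})
members = requests.get(CONTAINER, headers={"Accept": "text/turtle"})
print(asset.text)
print(members.text)
```

This is the same interaction model as the Atom Publishing Protocol mentioned in the abstract, with RDF resources in place of Atom entries.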

September 11, 2012 Posted by | linkeddata, standards | Leave a comment

Linked Data

Several months ago I edited my “About” text on this blog to add that: “After several years focusing on strategic and policy issues related to open source and standards, including in the emerging markets, I am back to more technical work.”

One of the projects that I have been working on in this context is Linked Data.

It all started over a year ago when I learned from the IBM Rational team that Linked Data was the foundation of Open Services for Lifecycle Collaboration (OSLC), which Rational uses as their platform for application integration. The Rational team was very pleased with the direction they were on but reported challenges in using Linked Data, and they were looking for help in addressing these.

Fundamentally, the crux of the challenges they faced came down to the lack of a formal definition of Linked Data. There is plenty of documentation out there on Linked Data, but not everyone has the same vision or definition. The W3C has a growing collection of standards related to the Semantic Web, but not everyone agrees on how they should be used and combined, or which ones apply to Linked Data.

The problem with how things stand isn’t so much that there isn’t a way to do something. The problem is rather that, more often than not, there are too many ways, which means users have to make choices all the time. This makes Linked Data difficult for beginners to pick up, and it hinders interoperability because different users make different choices.

I organized a teleconference with the W3C Team in which we explained what IBM Rational was doing with Linked Data and the challenges they were facing. The W3C team was very receptive to what we had to say and offered to organize a workshop to discuss our issues and see who else would be interested.

The Linked Enterprise Data Patterns Workshop took place on December 6 and 7, 2011 and was well attended. After a day and a half of presentations and discussions, the participants found themselves largely agreeing and unanimously concluded that the W3C should create a Working Group to produce a Recommendation formally defining a “Linked Data Platform”.

The workshop was followed by the submission, from IBM and others, of the Linked Data Basic Profile, and by the launch of the W3C Linked Data Platform (LDP) Working Group (WG), which I co-chair.

You can learn more about this effort and IBM’s position by reading the “IBM on the Linked Data Platform” interview the W3C posted on their website and reading the “IBM lends support for Linked Data standards through W3C group” article I published on the Rational blog.

On a personal level, I’ve known about the W3C Semantic Web activities since my days as a W3C Team Member but I had never had the opportunity to work in this space before so I’m very excited about this project. I’m also happy to be involved again with the W3C where I still count many friends. :-)

I will try to post updates on this blog as the WG makes progress.

September 10, 2012 Posted by | ibm, linkeddata, standards | Leave a comment

LibreOffice should declare victory and rejoin OpenOffice

When OpenOffice went to the Apache Software Foundation I started writing a post about this topic that I never got to finish and publish.

The post from my colleague Rob Weir on Ending the Symphony Fork prompted me to post this now though.

I should say that I no longer have anything to do with what IBM does with ODF and anything related. I changed positions within IBM in the fall of 2010 and now focus on other things, such as Linked Data, which I may talk about in some other post.

In fact, I’m now so out of touch with the folks involved with ODF that I only learned about OpenOffice going to the Apache Software Foundation when the news hit the wire. I had no knowledge of what was going on and have no insights as to what led to it. Similarly, I only learned after the fact about IBM deciding to merge Symphony with OpenOffice.

So, if anyone wants to blame me for speaking as a person on IBM’s payroll, I’m not even going to bother responding. This is entirely my personal opinion and I haven’t even talked about it with anyone within IBM.

But let me say quite bluntly what Rob is only hinting at: It’s time for LibreOffice to rejoin OpenOffice.

LibreOffice started from a sizable portion of the OpenOffice.org community being tired of Oracle’s control and apparent lack of interest in making it a more open community. I certainly understand that. But now that this problem is solved, what does anyone have to gain from keeping the fork alive?? Seriously.

While forks in the open source world can be a tremendous way of shaking things up, they can also be very damaging. In this case, I think it’s a waste of resources and energy to keep this going. Instead of competing with each other the LibreOffice and OpenOffice communities should get together to fight their common and real competitor.

I know a certain level of competition can be healthy but I’m tired of seeing open source communities fight with each other to their own loss.

I know the fork was painful and people still hold a lot of resentment toward one another, but they need to get over that. They need to realize they would do themselves and everyone else a real service by putting all this behind them and uniting. LibreOffice should declare victory and join forces!

February 3, 2012 Posted by | opensource, standards | 32 Comments

A little trick to make your presentation document smaller

Presentation documents have become an essential part of our work life and our mailboxes are typically filled and even clogged with messages containing presentations. Some of these attachments are unnecessarily big and there is something very simple you can do to make them smaller. Here is what I found.

I was working on a presentation which had already gone through the hands of a couple of my colleagues when I noticed it contained 40 master slides, or templates, while only 3 of them were used and most of the others were just duplicates.

From what I understand this typically happens when you paste in slides copied from other presentations. What usually happens then is that the software pastes the slide in along with its template, to ensure it remains unchanged. Given how common it is to develop presentations by pulling slides from different sources, this can quickly lead to a file carrying many templates, often identical.

I went on to delete all the useless copies in my document and saved the file, only to discover that its size had gone from 1.6MB to a mere 800KB. Even though disk space and bandwidth are getting cheaper every day, I think anybody will agree that a 50% size reduction remains significant!

So, here you have it. Tip of the day: to make your presentation file smaller and avoid clogging your colleagues’ mailboxes unnecessarily, check the master view of your presentation and consolidate your templates.

Of course the actual size reduction you’ll get depends on your file. In this case the presentation contained about 20 slides, only 3 of which included graphics.

For what it’s worth, I experimented both with ODF and PPT using Lotus Symphony, as well as with PPT using Microsoft PowerPoint 2003, and the results were similar in all cases.
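For those who would rather check a file programmatically than eyeball the master view, here is a minimal sketch assuming the third-party python-pptx library (which reads .pptx files only, not ODF). It reports how many masters a presentation carries versus how many its slides actually use; the file name is hypothetical.

```python
from pptx import Presentation

def audit_masters(path):
    """Report slide masters present in the file vs. actually used by slides."""
    prs = Presentation(path)

    # Every master shipped in the file, used or not.
    all_masters = list(prs.slide_masters)

    # Masters actually reachable from a slide through its layout.
    used_masters = {slide.slide_layout.slide_master for slide in prs.slides}

    print(f"{path}: {len(all_masters)} masters, {len(used_masters)} in use")
    if len(all_masters) > len(used_masters):
        print("Consider consolidating templates in the master view.")

audit_masters("deck.pptx")  # hypothetical file name
```

Note that this only audits the file; the actual consolidation is still easiest to do by hand in the master view of your presentation software.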

March 1, 2011 Posted by | standards | 1 Comment

The cost of wifi and wifi security (continued)

Just to clarify my post on the cost of wifi and wifi security: I’m not advising anyone to turn security off just because it may be significantly slowing their connection. For one thing, I didn’t.

Just like the old saying “safety first” goes, “security first” ought to prevail here. Indeed, there are several reasons why you should bear the cost of security and keep it on no matter what.

If you need more speed, like I do for my media center, the solution for now is to use a cable and avoid wifi altogether. For a media center that’s not so bad given that I don’t really need to move it around; it’s just that there are already so many cables that one fewer would have been nice…

In the future, the upcoming 802.11n wifi standard should alleviate the problem by providing faster speeds all around. You can actually already get products that support draft versions of the spec, but I prefer to wait for the technology to stabilize.

The intent of my post was merely to highlight something that doesn’t seem to be getting much coverage on the web and which I think people should be aware of.

Also, I should note that both devices – the router and your computer – play a role in this, so the loss in speed doesn’t necessarily come only from the router. The wifi card in your computer may be just as guilty. To figure this out you’d have to run tests with different computers, which I haven’t done (yet).

November 29, 2010 Posted by | standards | Leave a comment

The cost of wifi and wifi security

I break a long silence to write about something I just found out about wifi and wifi security. Admittedly it may not be an earthshaking discovery, but having searched for info on the subject, it doesn’t seem to be covered much, so it seems worth a blog post (plus, for once, I can give this higher priority than everything else).

There is a lot of info out there on how to set up your home wifi and secure it. However, little is said about what this will cost you – I mean in loss of speed.

I did some tests over the weekend and here is what I found, using speedtest.net, with a cable internet connection:

Directly connected to the cable modem (no router, no wifi): ~23Mbps download

Connected via cable through my router (“Belkin Wireless G Plus Router”), no wifi: ~17Mbps download. Gasp, that’s a 25% loss right there. I’m no network expert so I don’t know if that’s normal but I sure didn’t expect to lose that much just going through the router. But that’s actually nothing. Read on.

Connected via wifi through my router, with an open connection, no security: ~14Mbps download. Ouch. There goes another 18%. Unfortunately, that’s not even close to being the end of it.

Connected via wifi through my router, with security set to WPA-PSK TKIP: ~8Mbps download. Wow! That’s yet another 42% loss just for turning the security on, which every website out there says you MUST do.

The loss due to the security setting motivated me to run tests against the various security options my router supports. It turns out that all WPA options and WEP 128bits basically lead to the same poor results.

Setting security to WEP 64bits is the only security option that doesn’t severely impact performance: ~13Mbps.

Sad state of affairs!

WEP is known to be very weak and easy for a knowledgeable hacker to break in minutes. And 64bits is obviously that much faster to break than 128bits.

So here you have it. The choice is between fast and unsecured or secured and slow. Stuck between a rock and a hard place.

Obviously results will vary depending on the router you use, but here is the rub: when shopping for routers I found very little to no info on the impact of turning security on. Most product claims are for optimal circumstances, as in “up to xxx”, or relative, as in “10x faster than xxx”. This is of no help in determining what performance you will actually get.

One thing that plays a role in the performance you get is the CPU your router is equipped with. Yet, from what I’ve seen, this is not a piece of information that is readily available.

To make matters worse, from what I’ve seen, websites such as CNET don’t highlight that aspect either. So, you’re pretty much on your own to figure it out.

Beware. Run some tests and see for yourself what you get.
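If you want to make such tests repeatable, here is a minimal sketch assuming the third-party speedtest-cli package (import name speedtest), which automates the same kind of measurement speedtest.net provides. It averages a few download runs; run it once per router setting and compare the numbers.

```python
import speedtest  # third-party package: speedtest-cli

def measure_download(label, runs=3):
    """Average download throughput over a few runs, in Mbps."""
    st = speedtest.Speedtest()
    st.get_best_server()  # pick the closest test server
    mbps = [st.download() / 1_000_000 for _ in range(runs)]  # bits/s -> Mbps
    avg = sum(mbps) / len(mbps)
    print(f"{label}: {avg:.1f} Mbps average over {runs} runs")
    return avg

# Re-run after each router change: no security, WPA-PSK, WEP 64bits, WEP 128bits...
measure_download("WPA-PSK TKIP")
```

Averaging over a few runs matters because any single measurement can be thrown off by momentary network congestion.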

November 28, 2010 Posted by | standards | 9 Comments

The Facebook Oxymoron

Social networking tools à la Facebook and Twitter, combined with the always-connected nature of today’s hand-held devices, have led to the creation of a new type of oxymoron that never ceases to amaze me.

Postings such as “enjoying a great dinner with my spouse”, “having a great time with visiting friends”, “I’ve got so much work to do!” just don’t make sense to me. I say they are a new form of oxymoron.

Here is the thing, in case you didn’t get it yet: if you really were having a great time with your spouse or friends, you wouldn’t be posting about it. Social networking tools are great for connecting with remote friends, but there is nothing more anti-social than taking time off from interacting with the people around you to post on Facebook or Twitter. In my book this is just rude.

And if you really were that busy, you wouldn’t have time to post about it, would you?

Seriously. I do use Facebook and I enjoy it. But when I’m on vacation for instance, the last thing I think about is posting about it. I’m happy to do so afterward.

This is so obvious to me that it amazes me to see how many people seem to fall into that kind of habit. I don’t understand this.

Of course I know what people are going to say: “Come on, it doesn’t even take a minute to post something like that”. I know. And I’ll accept that even though I bet in most cases people don’t just post but also check what others posted so it takes them more than just a minute.

But still, I want to ask: what’s the point of these postings?

Are we turning into an ever more egocentric society where, under the pretense of caring about what’s going on in other people’s lives, we focus so much on our own that we feel an urge to tell people about everything we do?

I hope it isn’t so, and that this is merely excess usage that will fade away over time until we eventually find a happy medium.

April 9, 2010 Posted by | Uncategorized | 2 Comments

OOXML: And the light came on

I’ve thought about posting here several times over the last many months – I even have several drafts that never saw the light of day – but this blog keeps getting pushed too far down my priority list. However, I have to react to the buzz I’m discovering on my return from a week off.

Indeed, it is with quite a bit of astonishment that I read about Alex Brown’s frustration over Microsoft’s lack of interest in implementing ISO/IEC 29500 (OOXML). In the burgeoning comment section following his post, Alex writes:

@Mr Allison

> The outcome that many had predicted, yet you
> insisted would not occur

Oh? I don’t recall making predictions about Microsoft’s behaviour? URL please!

Well, let me give you a link to a prediction I made! In my post What Microsoft’s track record tells us about OOXML’s future of March 25, 2008 I wrote:

They can, and I predict will, ignore all these additions which are optional and stick to what they have. The only reason they were added was to remove reasons for National Bodies to vote against OOXML.

So, here we are. Two years later, Microsoft has done exactly that and Alex Brown is finally seeing the light.

One can only hope that the standards community will have at least learned a lesson from this sad story: you simply cannot take control away from a vendor who has a monopoly and isn’t willing to give it up through a mere standardization process.

April 2, 2010 Posted by | standards | 5 Comments
