James Fallows, Kevin Drum and Ezra Klein, all reasonable men to be sure, are gun-shy about Google. Fallows isn't eager to use Google's new note-taking system, Keep, because Google has cancelled interesting software in the past (not just Reader, but Health and Notebook). Drum notes that client-based software (which stores data on your PC rather than in the cloud) is still usable for a while even if the software company goes out of business, but Reader will be dead July 1 no matter what. Ezra is worried because he's reached the 30 gigabyte paid limit for Gmail storage and can't buy any more.
These are all legitimate concerns, but let's get some perspective here. Even the Google duds run for a minimum of 4 years (Health, 4 years; Reader, 9 years; Notebook, 5 years). And Google has a serious, well-funded effort to make sure you can export all your data from Google services.
When you put your data into the cloud, you run a bunch of risks. The first, and worst, is that you can't get your data back. Other risks include the service going down (not a Google habit), outgrowing the service, and the service changing in ways you don't like. The first two risks are handled about as well as they can be by Google. The last two are the way of all software, whether on the desktop or in the cloud; Google changes very incrementally and has pretty generous limits (30 gigs is a hell of a lot of mail, for example).
In other words, the free pony I get with Google is a pretty, pretty pony, and I’m sticking with it until another, better pony comes along, or until this pony is put down.
BTW, I haven't even started looking for a Google Reader replacement because I'm letting the people who panicked and moved the first day try out all the alternatives and report back. I'll post here when I find one.
Update: This seems to indicate that you can buy more than 30 GB of Gmail storage, a lot more. So, Ezra was wrong about the Iraq War, and Gmail. I expect a mea culpa sometime in 2023.
Xboxershorts
What 30GB limit? I have a 200GB google cloud storage that costs 50 bux a year. WTF is Ezra smoking and why isn’t he sharing it?
Belafon (formerly anonevent)
And this isn't unique to the cloud. Disk failure, an improper CD burn followed by rmdir -rf, etc.
Warren Terra
@Xboxershorts: I don’t have any inside knowledge either of Ezra’s intake or of Google’s services, but I suspect you’re failing to distinguish between storage rented to you by Google and Gmail capacity offered by Google. You may be able to store 200 gb of old mail archives for 50 bucks a year, but if Gmail can only access active files that are limited to 30 gb even with their paid service, those 200 gb of archives aren’t going to be ready to hand with your email client of choice.
NotMax
30GB of mail?
Digital Collyer brothers.
liberal
@Belafon (formerly anonevent):
If “rmdir -rf” is one of the biggest worries, then I’ll gladly keep my data on my own disk.
Roger Moore
I’d say that the single biggest risk is that your connectivity to the cloud will fail when you need it, so you won’t be able to get at your stuff. I would say that the best way of managing both that and the risk you mention is to have some kind of transparent local caching system. That way your files are still available if/when you lose your connection, and they’re more likely to be stored in a format that you can access if/when the provider gives up on it. Cloud storage is great, but only a fool would make the cloud their only copy of something important.
arguingwithsignposts
I’m sorry but Ezra Klein needs to man up and clean out his fucking inbox if he can’t handle 30GB of storage. WTF?
Also, I’m glad mistermix has done his shilling for Google, but what’s chapping me about this is it’s indicative of Google’s approach to “innovation.”
They can’t come up with their own ideas, so they use their considerable advantage to steal someone else’s bread and butter. Why would I use Keep if Evernote is already doing this stuff, and better, and it’s also free? People will start using Keep because it’s already included on Android and it’s going to be stapled into that top nav in Google products when you’re logged in. And yes, I know Apple and MSoft have been guilty of the same. Facebook and Twitter too.
When Google came out with Reader, it wasn’t the only player, but it bigfooted everything else because it was free and it was Google. Now they’re killing it, users be damned.
liberal
@Roger Moore:
Yep.
MattF
I still keep a local copy of my email, and backing it up happens every time I do my regular backup. For local storage, 30GB is completely negligible: you can't buy a 30GB HD because it's too small. Maybe I've just reached the 'get off my lawn' stage technically, but I've never understood why cloud storage is regarded as superior to keeping a copy close to home.
Roger Moore
@Belafon (formerly anonevent):
That's rm -rf, damnit. rmdir will fail on non-empty directories, and doesn't have either -r or -f options (though it does have a -p that's kind of like -r, but in reverse).
liberal
@arguingwithsignposts:
There’s nothing really wrong with people “stealing” each others ideas. The problem is, rather, as you mention—Google has extreme “market” power, if not true “monopoly” power.
Roger Moore
FYWP. (ETA: –with_a_very_rusty_pitchfork)
Prefect of the Congregation for Divine Worship and the Discipline of the Sacraments, Cardinal mistermix
@arguingwithsignposts: Keep at this moment is not any kind of threat to Evernote (it has a fraction of the features and no native client except for Android); it is certainly not stealing "innovation" (there are hundreds of note apps in the world, and Keep is one of them); and it is not included on Android automatically.
liberal
@MattF:
I don’t get it either, though I suppose if you have lots of devices there’s something to be said for it.
Then again, I don’t even have a smartphone, and told my nephew a week ago that if he really wants to learn to code, he should start with C.
Belafon (formerly anonevent)
@Roger Moore: Yeah, while I was doing some work it clicked that I didn't get the command correct. Switching between Windows (work) and Linux (home), I generally have to go looking up those types of commands. My bad.
Sayne
Fix your hair up nice, fix yourself up pretty, and meet me tonight in Atlantic City…
Belafon (formerly anonevent)
@liberal: I’m telling my kids at a minimum to learn Python, and I’m starting my middle one on Lisp.
liberal
@Belafon (formerly anonevent):
My other nephew mentioned Python. I brought up C just because I think it gives one a better understanding of how memory works, etc.
Lisp—you have any direct practical use? I occasionally read up on flame wars about this or that language, and (saying this as someone who isn’t truly a programmer) ISTM that functional programming is one of the big trends, which (absent hard empirical research showing increased productivity and code quality…maybe there is some?) I don’t really understand.
Roger Moore
@liberal:
It’s also supposed to have reliability advantages. Unless you’re unusually careful about backups, your local storage is a single point of failure. Cloud providers are supposed to keep multiple copies of your data, ideally at multiple locations. They deal with boring hardware issues like drive failures, backups, etc. You put your stuff on the cloud, and it’s there for you with high reliability and no need to worry about exactly how that happens. It’s probably a pretty good idea to do that with some of your critical documents (tax records, legal documents, etc.) that you really, really don’t want to lose.
MattF
@Belafon (formerly anonevent): Bingo. Although, truth to tell, scientific Python is looking more and more like a fork that’s stopped at Python 2.7.
liberal
…adding, not that I have anything against Python. At least it's relatively pretty, compared to Perl, which, while pretty useful, is just an ugly, ugly language amazingly full of idiosyncrasies.
Ronnie Pudding
Not too long ago, people used to get by without any of this shit.
liberal
@Roger Moore:
I more or less agree. While I don’t think it’s necessarily all that hard to have reliable local storage practices, unless one invests the time to set it up straight and maintain good (administrative) practices, it can get pretty goofy.
Odie Hugh Manatee
My cloud is my home network and that’s just fine with me.
liberal
@MattF:
What do you mean by that? (I'm writing as someone who's trying to pick up some Python these days.) That scientific Python isn't being carried over to Python 3?
Ivan X
AFAICT, that link has the columns reversed. You can buy gigs of Google Drive storage, but email tops out at 25 GB; that's also what they indicate on the "new pricing plan" link on the same page, where they make it marginally clearer.
This is my problem with Google — not that that they aren’t innovators (they most assuredly are), or offer a tremendous value for what you pay (they do), or kill technologies you’ve invested in prematurely (they do, but I can live with it). It’s that there’s such wild interface inconsistency and lack of logic everywhere.
The help articles don’t agree with each other and are sometimes wrong (as I believe is the case here), the mail settings sometimes require you to save at the bottom and sometimes don’t, you can’t select all contacts in a specific group (or all contacts, period), there’s no clear explanation as to how to set up an IMAP client correctly and how to expect it to behave, the link to the Google Apps dashboard is in the Gmail settings menu rather than the Account settings menu where it obviously belongs, if an Apps account’s payment expires and you try to log in from Gmail.com rather than the Apps login page, it says “you don’t have this service, contact the administrator” even when you are the administrator — as you have probably figured out, I could go on and on. I find them confounding, since they offer so much value and have such good ideas, but have such a seemingly random deployment of them. Oh, and I find them aesthetically horrendous even as I admire their technical sophistication. I curse Google many times a day even as I willingly use their products.
Roger Moore
@liberal:
It’s not just the administrative practices to have good backups. If you’re really hardcore about this stuff, you probably want non-local backups, at least of your most critical data, so you’re protected against a natural catastrophe that would destroy your whole site. This isn’t just something for business, either. My hard drive has a lot of valuable (to me) information on it that I’d really like to have, even if my house burns down and takes all my computer gear with it. I solve this by rotating backups and keeping a recent copy of my hard drive in my desk at work, but if I were starting from scratch I would at least consider cloud backups.
MattF
@liberal:
First, a general disclaimer: I am a scientist, not a software developer, and I do not have specific knowledge about internal Python community politics.
That said, the big integrator of scientific Python libraries is Enthought:
http://www.enthought.com
and, as far as I can tell from their public documentation, they’re not interested in moving to Python 3.x. Anyone who uses a current Enthought distribution gets Python 2.7 plus a very large number of proven-compatible libraries. If you want to move away from that, you’re on your own. Not much of a choice if you’ve actually got a job to do.
The broader picture is a divide between the ‘Python language’ and ‘scientific Python’ communities. Both sides have worked at reducing the differences, and everyone has good intentions, but Python language users in the real world pretty much have to make a choice about what version of Python they’re going to use for the foreseeable future.
crosspalms
It is a pretty pony, but it’s a pretty pony that lives in a coal-fired warehouse somewhere. Just like I’m typing this on a coal-fired (but I’m in Illinois, so mostly nuclear-powered) machine. Cloud’s a nice name for power-eating server farms.
liberal
@Roger Moore:
That kind of thing is precisely what I mean by administrative practices. “Burned two copies. Check. Verified both copies. Check. Deposited one copy [of home data] at work. Check.”
liberal
@MattF:
I downloaded and installed the free version of Enthought to follow along with Python for Data Analysis. But is it really that hard to go it alone and just install things on your own? (I’m asking, not flaming, and I realize that most of us don’t have time to tinker all day.)
liberal
@crosspalms:
I agree with the sentiment, but a hard-core analysis would have to compare the actual large-scale environmental cost of all this, versus other human activities.
Without having time to look into details, I’m more sympathetic to the idea of server farms than people driving SUVs in relatively temperate climates.
liberal
@Ivan X:
I have very little knowledge of the Google playground, about 3 orders of magnitude less than you, but I do know their web Gmail interface is hardly the pinnacle of human/computer interaction quality, and their succession of interfaces for Google Groups isn't as good as that of the predecessor (Deja News, IIRC).
magurakurin
I don't have any problem with Google. It's the will of Landru. I am one with the Body.
seriously, just assimilate yourself with the Borg and be at peace…
or as the Wizard would say
Don’t worry so much! Relax killer, you’re gonna be all right.
cat
@Belafon (formerly anonevent):
Good thinking! It's easier to learn imperative programming after learning a lambda/functional programming language than the reverse.
@Odie Hugh Manatee: Computer Science, programming, and Software Engineering are still in the "the leeches aren't making the patient better, maybe we need to add more leeches" phase. Most people (that is, people who don't have to code for a living) are better served by not learning C/C++, as they don't have the skill and experience to code in C/C++ safely and effectively.
MattF
@liberal: As long as the things you are installing are all robust and orthogonal (forgive the jargon) it’s pretty easy.
But if, to take a horrible example, you want to try out OpenGL-under-Python, you need a windowing library, you need to deal with a maze of OpenGL system-specific bugs and dependencies, plus implementations of the various vector and matrix variables that the OpenGL API uses. I'm happy to let Enthought worry about that sort of thing, and I'm not happy to have to try to do it myself.
liberal
@MattF:
Fair enough.
terraformer
Maybe I'm a technology Luddite, but I think one of the primary "risks of the Cloud", at least from my perspective, is security. I'm not too keen on trusting anything that is not on the system in front of me, and certainly not something that's residing on a server somewhere that is owned not by me but by someone else.
Walker
Anything that requires me to work in the cloud, as opposed to making cloud storage an option, is a no go for me. Too many places where connectivity just does not work (e.g. on airplanes, in New Hampshire, etc…).
File formats can be converted.
Liberty60
This thread makes me think of Michael Lind's article yesterday in Salon discussing rentiers, who profit passively while contributing little of value.
The single biggest problem I see with data storage is that it is being offered as a service, for rent and not ownership.
You don't own your e-books, you don't own access to data, and in some sense you don't even own your own hardware and software; within a couple of years, your PC will be unable to operate with others, and your software will not be compatible with others.
The constant churn of "innovation" effectively means we are, and will forever be, paying rent on even the most basic ability to function in industrialized society.
Working our jobs, paying our bills, communicating, reading…all these things will become the function of rents.
In the business world, when a property owner develops a plan for a building, they project an expected schedule of capital expenses and operating costs over the life of the building- how much they expect to spend yearly on repairs, how often the furnace and roof needs to be replaced and so on.
The goal is to make these things last as long as possible, to minimize the cost of ownership.
What if we did that for our own personal property? What if we forecast how much we expect to spend each year for a computer, or how often we expect to trade in our phone handset? What if we created an estimate of the cost of ownership of our cars, TVs and appliances?
I think this is the hidden cost of innovation: it reduces the effective life of our goods, and accelerates obsolescence to the point where nearly everything we use becomes disposable.
? Martin
NO!
When you put your data in the hands of a service which you pay nothing for, you run a bunch of risks. If you are not paying, then you are not the customer. In this case everyone who advertises through Google is the customer – they’re paying. And if they start paying less or demanding more (and they will work tirelessly to get both), then YOUR service goes away.
If you want YOUR data to survive in the cloud, then fucking pay for it. Period. Pay for Evernote, pay for Dropbox, pay for these things and they will survive for you because YOU are the customer.
Spike
This thread made my inner 7-year-old very happy that I have that cloud-to-butt browser extension installed.
Tractarian
I am really sick and tired of this hackneyed BS. I am the customer because I am the end user whose eyeballs the advertisers want to reach. Google’s free products do not need to appeal to advertisers; they have to appeal to me, the end user. Because without me, the advertisers disappear, and Google goes *poof*.
Right. As long as you pay for it, it’s guaranteed to last forever and be perfectly stable. Makes sense to me!
Pococurante
As for RSS readers I’m very happy with Newsblur and was more than happy to sign up at $2/month.
It's a one-man operation. I expect he'll be bought out within a year.
RareSanity
There’s an article that I read yesterday, that I think is excellent at addressing the whole thing with Google and it eliminating services:
http://www.androidpolice.com/2013/03/21/editorial-just-because-google-closed-reader-doesnt-mean-you-can-never-trust-any-service-ever-again/
It's an Android-oriented site, because changes that Google makes to services disproportionately affect users of Android. However, the article is completely unrelated to mobile platforms.
You should read the whole thing, but basically, the author states that Google has been going about the process of treating its technology-based ADHD, which is absolutely true. They are trimming the total number of services that are active, so that they can better focus on their "core" services and present a more consistent interface between them.
I love, love, love Google Reader…and I’m unhappy that it is going away. However, as the author noted, there are alternatives, and now, there are multiple companies that are now competing for soon-to-be orphaned Google Reader users.
That's right: the killing off of Reader is allowing for actual competition in the RSS space that hasn't existed since the introduction of Reader. I've decided on Feedly, but there are at least 3 other viable alternatives, with more to come I'm sure.
As far as cloud services, I think that people have gotten to the point of some real entitlement issues. For the people who only use the free version of GMail (I use it and have a paid Google Apps account), Google is giving you, for free, a top-notch email service with more storage than the average person will ever use. They are handling all of the back-end administration, paying for all the bandwidth, paying for all the storage servers, and filtering your email for scams and spam, without anyone having to lift a finger for the service.
I was one of the original invite-only beta users back in 2004-05 and in the almost decade since I’ve been using it, I’ve never experienced an outage, lost emails, or any other screwy behavior that happens with disturbing regularity in other organizations and services.
I can still, to this day, pull up my original “Welcome to GMail” email from way back when, and I have not had the slightest problem, or worry of a problem with email since I first logged in.
For all the various "issues" people are complaining about, I invite you to experience the "joy" of attempting to administer your own email from your own server. As many people here will attest, it is a difficult, frustrating, full-time job, one which also justifies many people getting paid good money to do.
RSR
Google bought my favorite home media server software/hardware platform, SageTV, and promptly disappeared it. It’s been MIA for two years, I think.
On acquisition day, without warning, they stopped selling both the hardware and software. The last generation of both was quite good, and I miss them. Grrrr.
RareSanity
@liberal:
You don't know how much of a favor you did your nephew. Learning 'C' is not just about learning to "program"; it's about learning how computers work.
It should be pointed out that I don't care what operating system is used, or what the programming language du jour is: at the end of the chain, it all becomes 'C'.
All of the kernels, compilers, interpreters, virtual machines, frameworks, etc., etc., are ALL written in 'C'. That includes Windows, OS X and Linux. The understanding, and ability to operate, from the top of the programming chain all the way to the bottom is what separates the proverbial men from the boys in the realm of programming.
Whether you are programming in C#, Java, Python, Lisp, Erlang, PHP, or whatever…what do you do, when you have a problem that you absolutely know, has nothing to do with your code?
Are you able to follow the execution of a program down through the VM/interpreter/framework level, down to the kernel level, so you can figure out exactly what the hell is going on?
What if you have a deadline that won’t allow you to wait for whomever maintains these “interfaces” to decide they’ll fix, and release, a new version that addresses your problem?
I started out writing embedded code for cellular phones. Everything below the application layer (the layer everyone sees) of a modern cell phone is written in C/C++. The closer you get to the actual kernel, the more everything will be in 'C' only. It is the only language that will allow you to work directly with the hardware (CPU, RAM, storage, etc.) of a computer with no other layer between you and it. Even so, you often still see assembly code embedded in kernel code for extremely time-sensitive operations like audio and graphics.
You tell your nephew, that if he knows ‘C’ in addition to whatever higher level language he may use, he will be light-years ahead of programmers under (about) 40 years old.
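[For anyone curious what "working directly with the hardware" looks like in practice, here is a minimal sketch of memory-mapped I/O in C. The register address and bit position are invented for illustration; a real one would come from the chip's datasheet.]

#include <stdint.h>

/* Hypothetical memory-mapped GPIO output register. The address below is
   made up for illustration; a real address comes from the part's datasheet. */
#define GPIO_ODR (*(volatile uint32_t *)0x40021018u)

void led_on(void)
{
    GPIO_ODR |= (1u << 5);   /* set bit 5: drive the pin high */
}

void led_off(void)
{
    GPIO_ODR &= ~(1u << 5);  /* clear bit 5: drive the pin low */
}

[The volatile qualifier is the whole trick: it tells the compiler that every read and write to that address matters and must not be optimized away or reordered.]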
? Martin
@Tractarian:
How many ads do you view in Google Docs? None. In Reader? None. In Notebook? None. In Drive? None. These services only exist as a way to siphon off information about you to feed to advertisers. And if that information isn't useful to sell more ads, or charge more for ads, then the service will go away; it's a cost center. The utility of these services is determined entirely by the advertisers. You have no way to communicate to Google HOW valuable these things are, because they are, by definition, valueless to you. You pay nothing. Whether they exist depends 100% on what kind of utility they deliver (which means, ultimately, the value of the content you put in them) or the altruism of Google.
The services that actually deliver ads – search, adsense, and so on – they’re pretty much guaranteed to remain – and they’re the ones where your input matters, because those are the ones that actually deliver the revenue. The rest – I wouldn’t count on a single one of them until Google gives the option to put your cash in to keep it running.
I didn’t say that. But customers have control in this case. If the service isn’t profitable, you’ll have some clue – because they’ll try and raise prices. Or they’ll post earnings and you can tell that way. But they’re going to communicate with you in either direct or subtle ways.
Within Google, you have no way of knowing which services Google is inclined to keep and which ones they are inclined to kill off. As far as we can tell, it's arbitrary. We can only speculate after the fact. Reader was extremely popular, but it apparently didn't deliver as much value to the ad side of the house as Wallet does, which is much less popular. But the popularity and utility of these things are completely buried in Google's financial statements, and nobody outside of the executives knows what the contribution of each service is to each dollar of ad revenue.
But do not be deluded into thinking that these services exist for your benefit. They don’t. Not until you are paying.
Xboxershorts
@liberal:
Indeed, I use the cloud for sharing; most of what I have in the Google cloud is backed up elsewhere and securely saved locally.
RareSanity
@RSR:
It’s not MIA…it’s now called Google TV.
Definitely not as “accessible” as SageTV was, but definitely prettier. :-)
Roger Moore
@liberal:
And for personal use, that’s probably about all there is to it, except for the need to find a safe location for your backups. (Not everyone has a convenient place like their desk at work to keep this stuff.) For a business, though, it probably means finding a company like Iron Mountain to provide the secure transport and storage of the backups at a remote location, which is far from free. Cloud storage handles all that stuff transparently, which seems like a huge win to me.
RareSanity
@? Martin:
I disagree. As a matter of fact, you outlined this earlier in your response:
Search, GMail, Maps and YouTube are never going anywhere…EVER.
Just about every other service they offer exists simply to improve the ad-serving functions of those "core" services. Those are the only services that Google actually serves ads on. They don't even show ads on Google+…yet.
I'm getting worried about Google Voice going away, because it basically exists only to improve Google's voice recognition for Search. If they ever decide that they've gotten all they can get from Google Voice, it will be marked for deletion as well.
liberal
@Roger Moore:
Funny you mention that…I think that’s the firm our sysadmin proposed using for all our backup tapes when he first got here. “What if the building catches on fire.”
Meh. Given the quality of our operation, I didn’t see why he just didn’t dump them in a closet in our other building. (The concept of sending it somewhere else is a good one, but we run a pretty shoestring operation here, and it seemed like overkill. Adding, cloud storage wouldn’t be a good fit for us economically because we have terabytes of data.)
liberal
@RareSanity:
With the caveat that I’m sure I know at least one order of magnitude less about this stuff than you do, that’s also one of my main motivations.
Maybe it’s not a perfect fit, but the C “memory model” is pretty reflective of what’s actually going on.
Roger Moore
@? Martin:
I don’t think that’s strictly true. A lot of them exist as loss leaders, both to pull you into the Google ecosystem in general and to get you to upgrade to higher levels of the same service. So an entrepreneur who is happy with their personal Gmail account will decide to outsource mail services to Google when he founds his company. Maybe he’ll decide to use Google Docs, Calendar, etc. instead of the Microsoft versions. Somebody who likes sharing pictures via Picasa will pay extra for more storage when he reaches 1GB. And so on.
liberal
@Tractarian:
Hmm…on a related note, somewhere (probably the dead-tree Wash Post) there was an article about Firefox thinking of making it much harder for websites to track users. The article had all this whining from the usual suspects.
OK, I get that this ad tracking stuff isn’t necessarily Big Brother, and I get that it pays for a lot (ultimately), but why the f should a browser maker listen to anyone other than browser users?
liberal
@Liberty60:
Wow, that’s strange—he actually mentions landlords. Despite the obvious origin of the word “rent”, almost no one these days mentions landlords, not even someone as smart and liberal as Dean Baker.
liberal
@Liberty60:
I think you’re confusing the colloquial and economic meanings of the word “rent”.
RareSanity
@liberal:
It is directly reflective of how the memory model works.
As anyone who has mistakenly dereferenced a NULL pointer, or incorrectly managed a counter in a loop, can attest, there is no "safety net" in C when dealing with memory, unless you write it yourself. You can access every corner of memory from your code, up to and including the all-important program counter. It is the pinnacle of programming flexibility.
However, that flexibility comes with a price. C is also completely unforgiving, and will carry even the smallest mistake out to its eventual fatal (to the program) end.
Usually without any warning.
People will say that what I have described is the epitome of why C is "bad". But at some point, memory has to be manipulated directly. I say that if it has to happen, I'd rather write it myself than trust that someone else (someone I don't know) is "taking care of it".
I like to know where the bodies are buried…so to speak.
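[A tiny sketch of that missing safety net, for the non-C folks following along. Both bugs below compile without complaint; the behavior at run time is undefined, which in practice usually means silent corruption or a crash:]

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *p = malloc(3 * sizeof *p);  /* room for p[0], p[1], p[2] */
    if (p == NULL)
        return 1;

    p[3] = 42;   /* off-by-one write past the allocation: no error raised */
    free(p);

    int *q = NULL;
    *q = 7;      /* NULL dereference: typically a segfault, with no warning */

    printf("unreachable on most systems\n");
    return 0;
}

[A Java or Python runtime would raise an exception at both of those lines; C just does what you told it to.]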
JustRuss
@? Martin:
I find your abundance of faith disturbing. Seriously, you don’t believe any paid-for services have gone belly-up, leaving customers in the lurch? If you want to argue that people shouldn’t bitch about free services, fine. But suggesting that paying for a service guarantees, well, anything, is laughable.
Jay in Oregon
@Tractarian:
Yeah, just ask the users of Sparrow, a Mac OS X mail client, how well that worked out. (Shorter: People paid for Sparrow to support the product and the developers, but Google bought the company and development halted anyway.)
? Martin
@liberal:
Every browser does this by default now that Firefox has jumped on – except Chrome. Safari started it and has done this since it was introduced. IE came next. Firefox came on recently.
And again, Chrome is a Google product. They listen to who pays for the development of the product, who again are the advertisers. Google just yanked all of the adblock apps from the Play store, while continuing to complain that Apple retains the authority to decide what goes in their app store.
I have nothing against Google, but I wish people would stop treating them as some altruistic agent for a better internet, when they could equally be labeled the world's shiniest direct marketer. I think that's also overstating it, but I really prefer dealing with companies that tell me straight up what the cost to me is going to be, rather than promising me something cheap and then extracting their value without my knowledge. Does anyone really know what Google does after they read all of your email?
RareSanity
@Jay in Oregon:
On the bright side, Postbox is pretty awesome…and cheap!
The dark side is that it kind of sucks that Google took everything when it really just wanted the mobile stuff. Truth be told, Google doesn't want people using clients, especially desktop ones, to access Gmail. They can't show ads in email clients.
They want you using a web browser on anything that doesn’t have a Google developed Gmail client. I think that’s why they bought Sparrow…to integrate the stuff that made it so popular into their own clients, so people will be less and less inclined to use third party clients.
liberal
@? Martin:
Completely agree.
I actually download my email from Google using POP3, but I’m not convinced that’s much help. Sadly, don’t want to deal with changing my email address now.
? Martin
@Jay in Oregon:
That's actually a good example. Sparrow was too cheap to pay for its own development. The developers knew that from day one, but the users haven't yet learned how to value that. It was still effectively free. Honestly, an app you use every day isn't even worth a penny a day? Users need to learn to throw up red flags over that. They haven't.
$.99 was designed to move Sparrow up the paid-apps lists (it never would have registered on the free-apps list) and to secure a round of funding. The goal had always been to sell out to Google or someone else. It was never, ever going to be profitable, and anyone who knew enough to look at their sales stats and do some 3rd-grade math would have known that.
Sparrow was still effectively a free app.
liberal
@RareSanity:
Yeah, I hedged because my knowledge of code/data/heap etc is pretty weak, so I wanted to blunt what I was saying so as not to stick my neck out. Personal foible.
Cris (without an H)
Jeepers, I didn’t know Ezra supported the war. Where was he blogging at the time? That was post-Pandagon, I’m sure.
Jay in Oregon
@? Martin:
I didn’t discover Sparrow until it was on the Mac App Store at $9.99; they also have a free (ad-supported) version.
I wasn’t aware that it was available any cheaper than that. I’m using the free version for now because the ad intrusion is minimal and I’m iffy about paying for an app that apparently has no future development planned. (And frankly, if they put in Gmail’s mute feature, I’d probably still pay for it.)
Roger Moore
@liberal:
I think the question of how serious you have to be about protecting your data depends on circumstances. A small company operating on a shoestring probably needs to spend its money elsewhere. A multinational company with datacenters on multiple continents is probably better off securing its data by replicating it between its existing branches. Cloud backup is probably better for a mid-sized company that needs to protect its data but is too small to be able to do everything itself.
BrianM
@RareSanity: Or you could scare up an old Perq workstation, which had user-programmable microcode. Never wrote any, but I had to read it (to translate it into C). At the level of microcode, even C’s memory model fibs.
Roger Moore
@liberal:
Because they still have to work with websites that depend on ad revenue. One of the agreements when the advertisers agreed to accept Do Not Track was that it would default to “off” in the browser and users would have to turn it on deliberately. There is an implicit threat in all of this that web sites that depend on ad revenue will retaliate against users who deprive them of that revenue and browsers that make it too easy to do so.
liberal
@Roger Moore:
That’s a good, concrete way of putting it.
cat
@RareSanity:
Let me guess: you are an EE who took to programming? Because that statement is wrong on several levels.
C is usually picked by vendors due to the low resources required for its runtime, and because the ABIs of most OSes conform to C's calling convention (though this may be a feedback loop).
The runtime is usually where the actual hardware interaction happens. The only hardware interaction C can do by itself is with hardware that is memory-mapped, where you can aim a pointer at the mapped address to do peeks/pokes; all other hardware access has to be done via assembly, not C.
C also has no mechanism for accessing registers on the CPU unless the vendor extends C.
cat
@RareSanity:
It's also the epitome of why C programmers are terrible software engineers. You cannot possibly spend enough resources validating your homegrown solutions to common programming patterns.
Oh, and C can't access every nook and cranny of your computer's memory, as the MMU is very capable of hiding large portions of it from you.
liberal
@Roger Moore:
That doesn’t really make any sense to me. I mean, it makes some kind of “political” sense, but as far as coding is concerned, I don’t see how they can truly retaliate. What would they do? Deny HTTP requests? Based on what token of information? Some hard-coded identifier on the motherboard? (I’m not all that acquainted with whatever trusted computing initiatives are involved in that.)
liberal
@cat:
Huh? The context of this discussion isn’t “C is the end-all-be-all of programming.” The context is the claim that, as far as pedagogy is concerned, C might be a good starting point because there’s some reflection of what’s going on under the hood, if not a 100% accurate reflection.
Furthermore, C is in terms of syntax a very simple, clean language.
RareSanity
@cat:
Yes, I am, as a matter of fact.
Let me venture my own guess. You are the typical Computer Science major (probably with a master's degree) who thinks that endlessly studying various theories of how things should be done has shit to do with actually getting shit done, on time, on budget, and profitably?
Apparently, in all of your smug glory, you misunderstand what a C "runtime" is. You speak as if this runtime is any different from the language itself. The standard C library is merely a group of precompiled objects (mostly written in C themselves) that are linked in at compile time. You don't have to use it; you'll just have to re-write all of the functionality it contains.
The hardware interactions I'm talking about are interactions with the I/O pins and peripherals on a CPU. It is the standard C library, using combinations of C code and assembly language specific to the processor the code is being compiled for, that actually maps the various hardware registers in the CPU to memory locations.
So yes, if I were to so choose, I can use inline assembly code to directly access CPU registers in my C code, without the use of the standard C library.
That is exactly why C is used, what’s your point? No it’s not a feedback loop, it is a conscious decision made based on data, not just “Oh well, this is what has always been used, let’s just stick with it.”
That is the dumbest thing you have written yet. How the hell do you “extend” C? You can’t, unless you’re into writing your own compilers…and exactly which “vendors” are doing this “extending”?
C does have a mechanism for accessing registers…it’s called IN-LINE ASSEMBLY. Do you CompSci blowhards even learn about assembly language anymore?
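[For readers who haven't seen it, here is roughly what in-line assembly looks like, sketched in GCC's extended-asm syntax. Note this syntax is a gcc/clang extension, not ISO C, and the rdtsc instruction is x86-specific:]

#include <stdint.h>
#include <stdio.h>

/* Read the x86 time-stamp counter register directly, via inline assembly.
   rdtsc leaves the low 32 bits in EAX and the high 32 bits in EDX. */
static uint64_t read_tsc(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t start = read_tsc();
    /* ... the code being timed would go here ... */
    uint64_t end = read_tsc();
    printf("elapsed cycles: %llu\n", (unsigned long long)(end - start));
    return 0;
}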
It’s funny how arrogant you are in your ignorance.
So now you just move to name calling?
Fuck you!
What language do you think all of the garbage collection routines of Java, Python or any other “managed” language are written in? You condescending bastard, at some point, there has to be code that manually tracks and manages memory.
What, do you think all that shit is just magically built into RAM chips somehow?
You’ve gotta be fucking kidding me…
The MMU doesn't protect shit; the kernel does. At the kernel level, I can access ANY address in memory I want. The only thing stopping me is the size of the memory connected to the MMU. If I request memory that is out of its range, it returns a page fault. Any memory that's there, I can access.
It is the memory management routines of the KERNEL that wall off segments of memory from being accessed. The MMU doesn't know what the hell stack and heap are; all it knows is that memory chip #1 contains address locations w-x, and memory chip #2 contains memory locations y-z. Its only function is to translate physical memory into logical addresses…that's it.
Jeebus, you are the perfect example of why I said that @liberal's nephew should learn about C programming. You fucking new-school CompSci majors have no fucking clue how a damn computing system works in the real world.
Here's a hint: it's more complicated than the block in the pretty picture in your book on object-oriented design theory. The code that runs inside the CPU and MMU is either assembly or compiled C, and the highly intelligent people who meticulously design and implement these things would disagree with your calling them "terrible software engineers".
As do I…go fuck yourself you smug twit. NONE of your bloated candy-ass languages run without the engineers that put all of the work in to design and build the very system they are running on.
Lurking Canadian
@RareSanity: This. Although you can write C++ on the bare metal, as long as you aren’t using exception handling. C is really just portable, readable assembly language. Which is both its beauty and its danger, of course.
Lurking Canadian
@RareSanity: I would quibble a little bit. Yes, the kernel has absolute power, but modern CPUs do more than translate virtual addresses to real addresses. They also have to implement protection schemes in hardware. System-level code can override and/or monkey with that process, of course, but it's not like there's software checking to make sure each instruction fetch points at executable code.
RareSanity
@Lurking Canadian:
You won’t find any C++ in the “bowels” of any computer system, just too much overhead.
I wouldn’t necessarily go that far. If you have good coders, C definitely improves the readability (thus maintainability) over assembly.
The main thing is that if you understand just about any “high” level programming language, you can look at C and get the general idea of what a piece of code is doing.
In assembly, you have to actually know what the particular mnemonics for that particular processor are, to have any hope of understanding the code…and even then you may look at a block and say WTF? Then spend quite a bit of time with pencil and paper trying to figure out what is being accomplished.
I mean syntactically (not necessarily structurally), C and C++ are basically the same language. It is the human readability part that separates C from assembly.
Everybody understands if-then-else statements and for/while loops. It’s a whole different ballgame when everything is moves, adds, divides, branches and register swaps.
I’m getting a headache just thinking about it…
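[To make the contrast concrete, here is a trivial C function with comments sketching, roughly, the register-level operations a compiler turns each line into. The mapping is approximate and varies by processor and optimization level:]

/* Sum the first n elements of an array: the C reads like the intent;
   the equivalent assembly listing would read like the mechanics. */
int sum(const int *a, int n)
{
    int s = 0;                    /* clear a register */
    for (int i = 0; i < n; i++)   /* compare, conditional branch */
        s += a[i];                /* load from memory, add, increment */
    return s;                     /* move the result into the return register */
}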
RareSanity
@Lurking Canadian:
Agreed.
However, those hardware implementations aren't just active straight from the semiconductor manufacturer; those definitions have to be set during the boot process. I guess they could also be set using the configuration "fuses" in the processor, but I'm pretty sure I can still access and change those fuses programmatically.
Now, I will submit that it could be the BIOS setting those limits instead of the kernel…but in the same breath, I’d say that most of any BIOS is also written in C, so one can still access those settings from C. It’s just a matter of “where” the code is running.
Lurking Canadian
@RareSanity:
I have seen embedded applications that run C++ right on top of the hardware. It can carry around a lot of overhead, but a decent compiler will only saddle you with the overhead for the features you actually use. If you need polymorphism, I’d rather trust the compiler to do it than try to roll my own. But I will reiterate that this means no throwing exceptions.
Anyway, I think you and I are in basic agreement. My slogan about "readable assembly" was not intended to be a criticism of C, or a comparison to C++. Rather, it is my attempt to capture the degree to which C does exactly what you want in exactly the way you think it will. This is not true of Java, Python, Lisp, or what-have-you.
RareSanity
@Lurking Canadian:
I would agree with this too.
I was speaking more from the memory block access level, not necessarily the instruction fetch level.
RareSanity
@Lurking Canadian:
I think you're right, we're saying the same thing; no need to belabor the fine details.
OhNoNotAgain
@RareSanity:
You ever get the impression that the programming biz has developed a bit of an authoritarian streak? I'm amazed at how many times people answer stuff on StackOverflow, etc. with a "just use this, don't try to program it yourself." Which would be great, if not for the fact that we're programmers.
Pseudonym
@RareSanity: Holy fuck you are a moron. The C language itself does not give direct access to hardware registers. That’s why you need to use inline assembly to access them. Inline assembly is assembly, not C. C is not the only language that supports inline assembly, and inline assembly is not part of the C standard. It is a vendor extension. Vendors such as Microsoft, Intel, and the GCC project have “extended” C to support this. I learned all about this in my CS masters program.
Also, how exactly does the kernel protect shit? In most modern non-embedded OSes it enforces isolation by giving each process its own address space. Guess what it uses to do this? A little piece of hardware called the MMU. Even at the kernel level you can only access memory that's mapped into the kernel's address space by the MMU.
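[A quick way to watch the MMU do its job, as a hedged sketch. This assumes a POSIX system; the address 0x1 is just a value that is almost never mapped:]

#include <signal.h>
#include <stdio.h>
#include <unistd.h>

/* When we touch an address with no mapping in this process's address space,
   the hardware faults and the kernel delivers SIGSEGV. */
static void on_segv(int sig)
{
    (void)sig;
    /* write() is async-signal-safe; printf() is not */
    const char msg[] = "SIGSEGV: the MMU had no mapping for that address\n";
    write(STDOUT_FILENO, msg, sizeof msg - 1);
    _exit(0);
}

int main(void)
{
    signal(SIGSEGV, on_segv);
    volatile int *p = (volatile int *)0x1;  /* almost certainly unmapped */
    *p = 42;                                /* faults; the handler runs */
    puts("not reached on typical systems");
    return 0;
}

[The point being: the kernel decides the mappings, but it's the MMU that enforces them on every single access.]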
C does give a pretty low-level perspective on what the computer is doing. It’s not some sort of ground truth though.
So basically, you’re wrong and cat’s right, with the possible exception of accusing all C coders of being bad software engineers. It’s just the ones who choose C for some inappropriate task like typical web serving because it’s more “dangerous” and “low-level” and some bizarre kind of sexy, and then end up reinventing the wheel, badly.