
Post-post PC: Materials and Technologies That Could Revive Enthusiast Computing

timothy posted about 7 months ago | from the more-conflict-minerals-mean-more-fun dept.

Upgrades 128

Dputiger writes "Given the recent emphasis on mobile computing and the difficulty of scaling large cores, it's easy to think that enthusiast computing is dead. Easy — but not necessarily true. There are multiple ways to attack the problem of continued scaling, including new semiconductor materials, specialized co-processor units that implement software applications in silicon, and enhanced cooling techniques to reduce on-die hot spots."


128 comments

Arsenide is a material? (4, Informative)

Okian Warrior (537106) | about 7 months ago | (#44914545)

Of all the next-generation technologies that we’ve discussed at ET, including carbon nanotubes and graphene, III-V semiconductors that use materials like indium, gallium, and *arsenide* are by far the most likely to make a mass market appearance within the next ten years.

[Emphasis mine]

Yeah, that article really seems to know what it's talking about.

Re:Arsenide is a material? (-1)

Anonymous Coward | about 7 months ago | (#44914583)

When I first read that headline I felt this sudden pain in my groin. Then I saw blood in my underwear. I think I need to go to the doctor...

Re:Arsenide is a material? (0)

Anonymous Coward | about 7 months ago | (#44914887)

When I first read that headline I felt this sudden pain in my groin. Then I saw blood in my underwear. I think I need to go to the doctor...

Congratulations on your first menstrual period.

Re:Arsenide is a material? (0, Funny)

Anonymous Coward | about 7 months ago | (#44915173)

I had that happen to me, too, once. I was certain that I had got it--my first period. But it turned out to be related to my penis. When I went to the Dr. to ask about my menstrual cycle, he picked it up right away that my penis looked very strange. Indeed, he had never seen a penis that looked like mine. I had always thought that my penis was normal, but now I was afraid. Was it my fault? Did I do something wrong that caused my penis to develop this way? Will I ever be able to get pregnant?

My Dr. suggested surgery, or at least serious psychotherapy. But none of those options appealed to me. I became depressed and withdrawn. I lost my job at the supermarket and resorted to petty crime and prostitution to survive. I became addicted to meth. Eventually, I found myself in jail, where everyone seemed to have a much bigger penis than I have. Finally, I realized the problem. The penis must indeed go. I found a lawyer willing to work pro boner and petitioned the state for a surgical solution to my continued confusion. We won. I was scheduled for my procedure, as well as the follow-up whormonal support, when all of a sudden it was cancelled out of the blue. The governor, in a tight race for re-erection, had enacted an order to halt my operation, saying that it was a waste of state funds. So here I am, caught in limbo, with a tiny penis that bleeds every month. Please, I implore all of you, help me get out of here.

You can send donations to the following address:

[REDACTED ON ADVICE OF COUNSEL]

Re:Arsenide is a material? (2, Funny)

Anonymous Coward | about 7 months ago | (#44915327)

At LeAsT YoU aRe StIlL hUmAn.
WaKiNg Up As A cOcKrOaCh WaS tRuLy A sHoCk.
BuT i FeEl As ThOuGh SlAsHdOt Is A hOmE.

The article wasn't written for techies (2)

Taco Cowboy (5327) | about 7 months ago | (#44914903)

... processor units that implement software applications in silicon ...

Isn't that the definition of an ASIC?

With the gaffe the OP has pointed out (gallium arsenide becoming "gallium, and arsenide") and this, I get the impression that the article's target audience isn't techies.

Re:Arsenide is a material? (0)

Anonymous Coward | about 7 months ago | (#44915333)

Spent hundreds on new underwear? Want to enjoy reading horrible headlines relating to articles about chips? Try Chipotlaway! Just one horrible headline can lead to up to a quarter cup of underwear blood. But Chipotlaway makes your underwear clean and ready for more!

Re:Arsenide is a material? (3, Funny)

bill_mcgonigle (4333) | about 7 months ago | (#44914697)

indium, gallium, and arsenide

See, that's why tech writers have editors, who can correct 'indium and gallium arsenide' to 'indium, gallium, and arsenide'.

Re:Arsenide is a material? (1)

Anonymous Coward | about 7 months ago | (#44914913)

>> arsenide

What's wrong with this? It's a metal compounded with English people's butts.

Re:Arsenide is a material? (3, Informative)

Anonymous Coward | about 7 months ago | (#44915101)

Arsenides are indeed a class of materials containing the element arsenic, a class that includes In-Ga-As semiconductors. But let me try to fix the original sentence; it's not as bad as you imply, though definitely incorrect.

Of all the next-generation technologies that we’ve discussed at ET, including carbon nanotubes and graphene, III-V semiconductors that use *elements* like indium, gallium, and *arsenic* are by far the most likely to make a mass market appearance within the next ten years.

Changes marked with asterisks. Indium and gallium are the group III elements, and arsenic the group V element, that make up III-V semiconductors. Poorly edited, yes, but not enough to disqualify the whole article, at least in my humble anonymous opinion...

Re:Arsenide is a material? (3, Insightful)

SuricouRaven (1897204) | about 7 months ago | (#44916533)

The writer, having done the research, would be unlikely to make a mistake like that. It's more likely a 'correction' performed by the editor, who mistakenly interpreted the sentence as a grammatical error. It's easy for someone to see 'gallium arsenide' and misinterpret it as the list 'gallium, arsenide' with a missing comma.

Re:Arsenide is a material? (3, Informative)

wjcofkc (964165) | about 7 months ago | (#44915199)

Take a second look. I made a post in their Disqus comments pointing out this error. The author of the article replied in less than ten minutes acknowledging the error with a promise to fix it. The error was fixed by the time I hit refresh. Instead of being all high and mighty, perhaps next time you should help out. I did, and it worked. Consequently, your entire post is now moot.

Re:Arsenide is a material? (0, Redundant)

wjcofkc (964165) | about 7 months ago | (#44915249)

Take a second look. I made a post in their Disqus comments pointing out this error. The author of the article replied in less than ten minutes acknowledging the error with a promise to fix it. The error was fixed by the time I hit refresh. Instead of being all high and mighty, perhaps next time you should help out. I did, and it worked.

Re:Arsenide is a material? (2, Funny)

Anonymous Coward | about 7 months ago | (#44915297)

Take a second look.

I already took a second look at your previous post. This would be the third.

High and mighty? (-1, Redundant)

Okian Warrior (537106) | about 7 months ago | (#44915359)

Disqus comments are blocked by Ghostery. Are you saying that people shouldn't run Ghostery?

Instead of being all high and mighty, perhaps next time you should help out...

Why?

Historically speaking, helping out doesn't help. Putting a warning (of sorts) in this forum lets a whole bunch of potential readers save time.

That's not being "high and mighty", that's making an assessment of quality... which is what our teachers do and what we are taught to do ourselves.

"High and mighty" is calling your own actions better than the actions of someone else, which is what you did.

Re:High and mighty? (4, Insightful)

Joining Yet Again (2992179) | about 7 months ago | (#44916469)

Historically speaking, helping out doesn't help.

It sounds like you're a cunt and have no idea how to help people.

IME, helping out nearly always helps.

Re:High and mighty? (1)

flyneye (84093) | about 7 months ago | (#44916995)

Disqus is only there to make you look at ads, not to add your $.02 to a forum discussion.
They obviously aren't looking for your $.02; it's censored to exclude anything remotely controversial to popular thought.
That leaves ad revenue as the purpose. It keeps you on whatever page for longer, across multiple refreshes, so they can show that you saw ads. They make money; you get shit for your time and contribution.
When I see a Disqus box, I know it's just a sucker trap.

Re:Arsenide is a material? (2)

asm2750 (1124425) | about 7 months ago | (#44915533)

GaAs semiconductors have been around for years. The issue is that GaAs is poor at growing a native oxide, which makes it expensive to fab.

You can get around this by adding aluminum to GaAs, creating a heterojunction transistor. Other materials, like indium, can be used as well.

The beauty of these materials is that you can get different bandgaps, making it possible to create a true multijunction solar cell and bumping the conversion efficiency up to around 40%, which is almost unheard of in normal silicon solar cells. The devices also have the advantage of running at multi-GHz speeds with little issue.
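
To put rough numbers on that bandgap engineering, here's a quick Python sketch using the usual linear/bowing interpolation for ternary alloys. The coefficients are textbook room-temperature approximations I'm supplying, not figures from the article or this thread:

    # Rough bandgap interpolation for III-V ternary alloys: Vegard's law
    # plus a bowing term. Coefficients are ~300 K textbook approximations.

    def eg_algaas(x):
        """Direct bandgap (eV) of Al(x)Ga(1-x)As; linear fit, valid for x < 0.45."""
        return 1.424 + 1.247 * x

    def eg_ingaas(x, bowing=0.477):
        """Bandgap (eV) of In(x)Ga(1-x)As with a quadratic bowing correction."""
        eg_inas, eg_gaas = 0.354, 1.424
        return x * eg_inas + (1 - x) * eg_gaas - bowing * x * (1 - x)

    for x in (0.0, 0.2, 0.45):
        print(f"Al({x:.2f})Ga({1 - x:.2f})As: {eg_algaas(x):.2f} eV")
    # ~0.74 eV: the composition lattice-matched to InP, common in photonics
    print(f"In(0.53)Ga(0.47)As: {eg_ingaas(0.53):.2f} eV")

Stacking junctions with staggered gaps like these is exactly what lets a multijunction cell harvest more of the spectrum than a single-gap silicon cell can.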

You need an update (0)

Anonymous Coward | about 7 months ago | (#44915673)

The beauty of these materials is you can get different bandgaps making it possible to create a true multijunction solar cell bumping up the conversion efficiency to around 40%

http://phys.org/news/2013-02-multijunction-solar-cell-efficiency-goal.html

Multijunction solar cell could exceed 50% efficiency goal

Re:Arsenide is a material? (1)

dbIII (701233) | about 7 months ago | (#44915819)

indium, gallium, and arsenide

Well, without the commas that's a material which people in labs have been using to get diodes as thin as a single atomic layer for nearly a couple of decades, so the tech journalist is nearly there and better than some. The stuff I saw in about 2000 was put on by chemical vapour deposition, which is a fairly cheap way to do things, but of course a very thin diode junction is small in one dimension but needs improved masking technology to be small in 3D.

Re:Arsenide is a material? (-1)

Anonymous Coward | about 7 months ago | (#44916389)

Arsenide is closely related to uranus.

Re:Arsenide is a material? (1)

Redmancometh (2676319) | about 7 months ago | (#44917141)

So I don't see any reason this would be limited to "enthusiast computing" (I read this as stuff made at home), but I don't see any problem with the statement you quoted.

I'm assuming you're saying arsenide should have been quoted as the entire compound, or that it should be gallium arsenide. It's not exactly an egregious error, since it can be solved by adding an s to the end. If they had said "arsenides" it would have been correct - not far off.

I say arsenides would have been correct since, IIRC, they use several arsenide salts, especially aluminum and gallium arsenide.

Or I missed something.

Nope (3, Interesting)

Kjella (173770) | about 7 months ago | (#44914565)

The reality is that "enthusiast" computing today depends on what companies care to provide as "slightly ahead of the current state-of-art" at exorbitant prices. Intel's not going to launch a new CPU for enthusiasts. AMD isn't going to launch a new CPU for enthusiasts. If they do, it's just because they can cherry-pick some CPUs from their server parts (Intel) or ones that perform exceptionally well at equally high power consumption (AMD). It is so insignificant to the overall market that progress would happen the same with or without them. We're just not a significant enough portion of the market to really warrant a new process or capacity or whatever.

Re:Nope (2)

bill_mcgonigle (4333) | about 7 months ago | (#44914711)

"slightly ahead of the current state-of-art"

Just to pick nits, it is the current state of the art, and just slightly ahead of commodity on any non-minuscule timescale.

Re:Nope (0, Insightful)

Anonymous Coward | about 7 months ago | (#44914765)

Except for LadyAda and her company, Adafruit!!!!

        http://www.adafruit.com/

Geek toys, workable prices, and filling the gaping void that the old "build 20 electronic devices at home!" kits used to fill. Bless her black flabby little heart for this business.

Re:Nope (0, Informative)

Anonymous Coward | about 7 months ago | (#44915183)

Good luck making a good PC with ARM and other non-power-hungry devices. In practice we still depend on Intel's and AMD's willingness to make something better. Yes, you can make an ARM cluster, but clusters are only useful for some use cases.

Re:Nope (3, Insightful)

b4upoo (166390) | about 7 months ago | (#44915161)

The funny thing is that when lightning-bolt-like breakthroughs hit, people almost never know whence they come. Somehow I get a picture of a kid with a handful of Raspberry Pi units somehow feeding in and out of a multicore processor, with a smartphone somehow involved, crunching magical equations that leave my jaw hanging down. It is almost like the mathematicians at Oxford getting mail from an unknown person in a mud hut in India with solutions for equations that nobody had ever been able to do before. Genius is a sneaky quality. It lives where it likes and resides in unlikely meat bodies.

Re:Nope (1)

SuricouRaven (1897204) | about 7 months ago | (#44916547)

Intel *does* make CPUs for enthusiasts - the i7 range, which gives the best performance current technology can offer at the top end, at £1000+ prices. They don't sell in enough volume to make a ton of money - the cash cow is the midrange stuff, the i3 and i5. But they are important for company reputation, keeping Intel firmly established as the King of Semiconductors: they can make the fastest chips around.

Re:Nope (1)

jedidiah (1196) | about 7 months ago | (#44917721)

Nope. Enthusiast computing only depends on being able to do your own thing. That can be leading edge performance or trailing edge performance. It all depends on your particular use case.

One key to remember here is that "ahead of the state of the art" is actually pretty trivial to achieve when your yardstick is ARM-based gear.

That goes for performance as well as flexibility.

Computers are commodities (0)

Anonymous Coward | about 7 months ago | (#44914589)

Get over it. There are plenty of other things you can tinker with if the urge strikes. RC boats and helicopters come to mind.

Re:Computers are commodities (1)

The Snowman (116231) | about 7 months ago | (#44915261)

Get over it. There are plenty of other things you can tinker with if the urge strikes. RC boats and helicopters come to mind.

Personal suicide machines? I think not [go.com]!

So, enthusiast computing switches to ... (1, Insightful)

DavidClarkeHR (2769805) | about 7 months ago | (#44914599)

So enthusiast computing either switches to smaller devices or refocuses on software development.

Doesn't really matter - how many companies cater to 'horse-and-buggy' enthusiasts, after all?

Re:So, enthusiast computing switches to ... (2)

westlake (615356) | about 7 months ago | (#44915005)

Doesn't really matter - how many companies cater to 'horse-and-buggy' enthusiasts, after all?

Quite a few, actually. Horse Drawn Hearse [liveryone.net]

Re:So, enthusiast computing switches to ... (0)

Anonymous Coward | about 7 months ago | (#44915059)

Doesn't really matter - how many companies cater to 'horse-and-buggy' enthusiasts, after all?

Quite a few, actually. Horse Drawn Hearse [liveryone.net]

Nice. More than computer shops, eh? OH SNAAAAAP.

Re:So, enthusiast computing switches to ... (1)

Joining Yet Again (2992179) | about 7 months ago | (#44916515)

For people who don't make lots of looong journeys, running a horse can be cheaper, less impactful, etc.

Indeed, I only stopped riding horses around here because there were too many cars.

The whole "horses are outdated" thing is like the "everyone rushed to the cities for a better job" Industrial Revolution myth: a lot of it was huge landowners making it untenable to continue renting their land, because they wanted to push people into the cities for more profitable work.

I want a Geek Port! (1)

Moof123 (1292134) | about 7 months ago | (#44914627)

I know there are solutions out there, but pure GHz means little to me these days. I want to actually do stuff with my PC besides play games and surf the web; my tablet has taken over those duties.

Maybe a USB Geek port so even my tablet can get in on the action?

Re:I want a Geek Port! (1)

32771 (906153) | about 7 months ago | (#44916863)

You can build one yourself. Consider the BeBox GeekPort:

One "GeekPort" (37-pin D-shell)

        An experimental-electronic-development oriented port, backed by three fuses on the mainboard.
        Digital and analog I/O and DC power connector, 37-pin connector on the ISA bus.
        Two independent, bidirectional 8-bit ports
        Four A/D pins routing to a 12-bit A/D converter
        Four D/A pins connected to an independent 8-bit D/A converter
        Two signal ground reference pins
        Eleven power and ground pins:
                Two at +5 V, one at +12 V, one at -12 V, seven ground pins.

My current favourites are STM32 Discovery boards, e.g.:
http://www.st.com/web/catalog/tools/FM116/SC959/SS1532/PF254044 [st.com]

(costs $10)

It even has a user USB port you can use. Then you have to invest some time implementing the whole communication chain to your PC. ST generally has a standard peripheral driver library that can help you with everything the chip in question can do.

Apart from the power-supply stuff the BeBox offers, you will find everything else and more in the microcontroller. If you want to build an external geek box, you would want to add a separate power supply. You could then use the microcontroller's PWM outputs to regulate some voltages for a power-supply output, or use some DC/DC converters for fixed voltages.

Off the top of my head I wouldn't know what to do with it, though. But I do need a power supply unit, some means of measurement, and some means of stimulating circuits.
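
If you do hang such a board off a PC, the host side can be as simple as a serial conversation. Here's a minimal pyserial sketch; the device node and the one-byte command protocol are hypothetical, defined entirely by whatever firmware you flash on the microcontroller:

    # Host-side sketch for a DIY "geek port": an STM32 board enumerating
    # as a USB CDC serial device. Port name and protocol are hypothetical.
    import serial  # pyserial: pip install pyserial

    PORT = "/dev/ttyACM0"  # typical CDC device node on Linux; adjust to taste

    with serial.Serial(PORT, baudrate=115200, timeout=1.0) as dev:
        dev.write(b"A")        # hypothetical command: sample ADC channel 0
        raw = dev.read(2)      # assume the firmware replies with 2 bytes
        if len(raw) == 2:
            sample = int.from_bytes(raw, "little") & 0x0FFF  # 12-bit ADC
            print(f"ADC0 = {sample}/4095")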

Re:I want a Geek Port! (1)

32771 (906153) | about 7 months ago | (#44916881)

Oh right, I meant to say tablet. Tablets support serial USB devices, and serial USB ports can be pretty fast too. You still need a PC to program the thing unless you can get OpenOCD to work on your tablet.

That word may not mean what you think it means... (1)

GreyLurk (35139) | about 7 months ago | (#44914707)

I don't think I would say "enthusiast computing" is limited to people who upgrade their processor to the latest and greatest every 6 months. I would rather call those folks "PC game enthusiasts". I'd reserve "enthusiast computing" for things like building BeagleBone/Raspberry Pi clusters: people doing more interesting things than just installing new motherboards constantly.

No (5, Informative)

The Cat (19816) | about 7 months ago | (#44914715)

There is no such thing as post-PC for the same reason there is no such thing as "post-doorknob" or "post-handle."

The PC is the correct form factor for getting work done by humans. Mobile devices are not. This will only change if human physiology changes, which is unlikely in any time frame measured in intervals shorter than 100,000 years.

The "post-PC era" is a marketing slogan designed to make you buy things. It is designed to get you back on the upgrade treadmill starting from the beginning again. It is not technologically accurate.

PCs are here to stay for a very VERY long time. Get used to them.

Re:No (0)

Guy Harris (3803) | about 7 months ago | (#44914891)

The PC is the correct form factor for getting work done by humans. Mobile devices are not.

I'd say, instead, that the desktop and laptop PC are the correct form factors for getting done the sort of work that you do when seated for a long time. There are probably people whose work is sometimes done while on the move and for which a desktop PC is obviously not going to work and for whom a laptop PC might not work very well; consider, for example, somebody managing a construction project who might need to look things up, enter data, do some calculations, etc. while on site. I suspect that a mobile phone would be the wrong form factor for them, but a tablet might be the right form factor.

(Mobility isn't a Boolean property; the easier it is to carry a computer, the more mobile it is. The ability to run on battery power, and to use wireless networking, helps a lot, too; it's conceivable that you could build an easy-to-carry computer that required you to plug it in, but I doubt there's enough interest in that to have many computers built that way.)

Re:No (2)

unkiereamus (1061340) | about 7 months ago | (#44915517)

The PC is the correct form factor for getting work done by humans. Mobile devices are not.

I'd say, instead, that the desktop and laptop PC are the correct form factors for getting done the sort of work that you do when seated for a long time. There are probably people whose work is sometimes done while on the move and for which a desktop PC is obviously not going to work and for whom a laptop PC might not work very well; consider, for example, somebody managing a construction project who might need to look things up, enter data, do some calculations, etc. while on site. I suspect that a mobile phone would be the wrong form factor for them, but a tablet might be the right form factor.

I'll actually give you a primary source, real life example.

I'm a paramedic. Every single patient for whom I have responsibility of care, I have to generate documentation for. Up until about 2008, that meant actual paperwork; around then, the industry as a whole began phasing in electronic medical records. To the business office they're great, because billing the patients and keeping the records is much easier, and for me, the end user of the system, it's great because, especially when you're using a touch screen and a properly designed program, the computer is much faster and easier to use than a pen (especially for me; my handwriting sucks balls).

Up until this point, and I imagine for a while into the future, the solution of choice has been the Panasonic Toughbooks that convert to a tablet form factor (CF-18, CF-19, etc.). The touchscreen is necessary because we actually use it to collect signatures (quite aside from the fact that stabbing at options on the screen easily covers 99.5% of the use cases), the portability was of course necessary because I start my paperwork in the pt's house and finish in the ER, the keyboard was necessary because I have to type up a narrative for each pt, and the ruggedness was necessary because we beat the hell out of our machines. The problem is, even the older machines are WAY more powerful than we need, not to mention heavy as hell (remember, I have to hand these to 96 yo pts to get a signature).

These EMR suites are starting to be developed for tablets, both iOS and Android, and the market is starting to come up with workable ruggedized tablets. Once we get over the industrial inertia we have (which is surprisingly significant, given how agile we're supposed to be), we're going to move to tablets with some sort of external keyboard (at a guess, in contrast to the ruggedized tablet itself, the preference will be for keyboards cheap enough to be effectively disposable), and it will be the right solution for us.

No, my biller and office manager will still be using a full blown PC, but in the field, not so much.

Re:No (1)

Guy Harris (3803) | about 7 months ago | (#44916179)

I'll actually give you a primary source, real life example.

Thank you x 10^6.

(My example was actually also a real-world example, not a hypothetical one, but it was a case of somebody doing that sort of white-collar construction work who asked for my advice on machines to buy, rather than somebody who had that machine already; he already has PCs at home and, I think, at work, but needed something for when he's actually at the construction site.)

Re:No (1)

locopuyo (1433631) | about 7 months ago | (#44916341)

That is a good example of work you can do with a tablet. But that isn't replacing work you would do on a PC. That is replacing work you would do with a pen and paper, or a laptop in your case.

Re:No (1)

unkiereamus (1061340) | about 7 months ago | (#44916497)

Except that it is replacing work I would do on a PC.

First, let's get rid of the notion that laptops are an inferior species spec-wise. Compared to a server they are; compared to a high-end desktop they are; but my primary computer is a laptop, because I spend 48 to 96 hours straight at work, so it just makes sense for me to have a computer I can take with me. In fact, I would go so far as to say that my laptop, which is a higher-end model but certainly not the highest end, is superior, spec-wise, to at least 80%, and probably 90%, of the desktops in use today. And you know what I get when I put my laptop on a desk rather than my lap? A somewhat odd form factor desktop.

Second, you know what services that can't afford Toughbooks do these days? The answer isn't "don't use EMR", because that's really not an option any more. They have the field employees fill out paper while they're in the truck; then, when they get back to base, they have to duplicate all that information into a desktop using a variety of services (ImageTrend being the 800 lb gorilla in the field).

I dunno. I had a beautiful woman asking me why I wasn't drinking with her as I finished typing up the last comment, and now I'm in the after-effects of having a beautiful woman demanding that I drink with her. I'm not sure I'm making sense, but I'm pretty convinced that I'm right... but my EtOH level suggests I might be full of shit.

Re:No (1)

jedidiah (1196) | about 7 months ago | (#44917751)

> Except that it is replacing work I would do on a PC.

No not really.

> First, let's get rid of the notion that laptops are inferior species spec wise,

Why? That's just an artificial idea you need to latch onto to make your argument work. It's not necessarily true or valid.

Conventional PCs have been shoehorned into a lot of areas where they aren't the best fit. A lot of tablet "productivity" use cases are merely a reflection of this.

Laptops aren't "spec inferior", they are mobile.

Re:No (1)

Joining Yet Again (2992179) | about 7 months ago | (#44916531)

So you moved from paper to tablet to tablet-on-a-crippled-OS.

In the UK, portable computing devices suited to data collection have been available since the late '80s, from Psion. And I got my first tablet PC over a decade ago, with a proper stylus - more usable for non-trivial work than thumbing an Android.

Re:No (1)

Guy Harris (3803) | about 7 months ago | (#44914975)

The "post-PC era" is a marketing slogan designed to make you buy things.

And to read columns blathering on about the "post-PC era". It's all about the CPM [wikipedia.org], err, umm, the CPI [wikipedia.org].

Not that TFA has that much to do with the consequences of the "post-PC era"; they ask "Is the PC enthusiast market dead, a casualty of the push into mobile?" (and answer the question in the negative), but that's all I could find. They probably slapped it into the title just to get people's attention.

Re:No (2)

rahvin112 (446269) | about 7 months ago | (#44915787)

Yes and no. Tablets, phones, and everything in between are replacing one aspect of computing, and that's strict consumption. Grandma doesn't need a PC to read email or look at the grandkids' photos; she can use a tablet, have a much better user experience, and gain the benefits of portability and reduced power use. People will use them to read books, watch movies, or browse the internet. In general they won't be using them to create anything.

Almost everyone I know has a tablet; my wife and I each have one. We use them for light content consumption and casual entertainment. That is what the vast majority of tablet users use them for, and it's a niche that isn't going away. Tablets are here to stay, just like PCs will continue to be used in business and in computing that involves real work or creating anything.

What's changed, and what will likely be detrimental to the whole business, is that PCs are now good enough. The CPUs are far more powerful than most people need, except in special areas like engineering. On the flip side, tablets aren't going to be yearly, bi-yearly, or even tri-yearly purchases; people won't generally be buying to upgrade. They, for the most part, provide everything that's needed right now. The only way vendors will be able to drive sales is by making them lighter or giving them longer battery life. Otherwise sales will be tied to breakage (or the hardware wearing out) and population expansion, which will mean significantly fewer sales than the initial wave where everyone bought one. I personally expect tablet sales to drop off precipitously over the next 5 years.

So yeah, there isn't a post-PC world, but tablets aren't going anywhere either, and the tablet form factor satisfies a LOT of general use among the general population. Combined with the fact that CPUs are generally good enough, that means PC sales are going to remain stagnant for the foreseeable future. Tablet sales will also likely stagnate or decline significantly once the market has saturated. Neither will see the sales that PCs have enjoyed for the last 30 years, and it's going to be severely detrimental to the business as a whole.

Re:No (1)

ShoulderOfOrion (646118) | about 7 months ago | (#44916293)

Actually, Grandma does better with a PC, with a real keyboard and mouse and a large monitor with large, easy-to-read fonts. Agreed, she doesn't need a tower case with dual water-cooled CrossFire GPUs like her grandson. A simple little cube thing with some USB ports, HDMI/DVI, and an audio output is sufficient. Grandma's eyes aren't good enough to see a tablet screen, her hands aren't steady enough to manipulate small touchscreens, and she can't hold a tablet and a small dog/cat in her lap at the same time.

Re:No (1)

stenvar (2789879) | about 7 months ago | (#44916045)

The PC is the correct form factor for getting work done by humans. Mobile devices are not.

Oh, I think there are better form factors. Take a look at a traditional workspace: a huge desktop/drafting table, dozens of documents/pages, and walls. Now imagine the desktop, the documents, and the walls all turning into smart, active displays. That's the correct form factor for humans. A 27" HD monitor and a noisy metal box on the floor are not.

Re:No (0)

Anonymous Coward | about 7 months ago | (#44916283)

Whether a computer is mobile is separate from its interface. You might find yourself in a world where everyone has a mobile computer in their pocket or implanted, and when they sit down to do real work, they are just sitting down at a screen and keyboard that wirelessly connects to their mobile device and runs on that. With wireless power the mobile device might be charged while using it in this way. That mobile device might then use an internet connection to access further computation power from a far-off data center when necessary. That would be a post-PC world, unless you count the mobile device as a PC, in which case you've redefined PC to just mean computer.

If we can get much faster and lower-latency internet, wireless charging, mobile devices with significantly greater computation power, low-latency wireless connections to screens, low-latency wireless keyboards and mice and, most significantly, somehow get cloud computing to be NSA-proof, then I'd even prefer this setup to a conventional PC - but I'm sure that lots of people wouldn't worry about most of that stuff. We could do something like this today. I think the main stumbling block is wireless charging and the computational power and heat dissipation capabilities of the mobile devices we've got today. To do this today, we'd probably need a powered docking station for the mobile with a more beefy CPU inside it, at which point you might as well just get a PC. There isn't even a software issue, really - the otherwise crappy hybrid Windows 8 interface is actually just right for this sort of thing.

Re:No (1)

Antonovich (1354565) | about 7 months ago | (#44916499)

Rubbish. You have obviously not been paying attention to the advances in HCI tech. So unless you are saying that a "very VERY long time" is in the 15-30 year ballpark, you are quite simply wrong. Sure, 30 years IS a long time considering how long computers have been around, but I certainly hope I'll still be around then. Just like fixed-line phones are quickly becoming a thing of the past, so too will other devices that don't move easily, so that a person can quickly be fully productive in any quiet, semi-private space they decide to work in. There is still quite a bit of work to do, and the keyboard will certainly take a while to be unseated from its current productivity throne, but it will happen without a shadow of a doubt, well before we hit the 100k-year mark.

Re:No (3, Funny)

Anonymous Coward | about 7 months ago | (#44916941)

>> The PC is the correct form factor for getting work done by humans.

Unfortunately most humans just want to play Angry Birds while taking a crap.

Re:No (1)

CyberNigma (878283) | about 7 months ago | (#44917459)

I'd say that a decent-sized monitor, a full-sized keyboard, and a mouse are the current dominant form factor for getting work done. Whether they are connected to a desktop or a mobile device is irrelevant to our physiology.

That said, the performance of the device connected to the monitor, keyboard, and mouse is what should be considered for productivity.

Source: see computing history from mainframes to minicomputers to microcomputers to mobile devices for their form-factor relevance.

"Enthusiast computing" (0)

Anonymous Coward | about 7 months ago | (#44914729)

You mean graphics rendering. Enthusiast computing died in the 80s when you no longer had to write your own software or build your computer from scratch.

The next frontier in computing is parallel processing, and we will be treading ground already walked decades ago by supercomputers: now we can fit all that performance on your desktop.

Rumors of Si Death Have Been Greatly Exaggerated (3, Interesting)

BarneyGuarder (44042) | about 7 months ago | (#44914753)

The new-semiconductor-technology angle in the article seems highly fishy to me. Apart from the fact that the statement reads like it may as well have said "in 10 years we will all be living in colonies on the moon", III-V materials have been losing market share to silicon for decades.

The article mentions the great electron mobility of the III-V materials, which is true, but forgets to mention that they have poor hole mobility. Now, I am not a process expert, so maybe there are new techniques to address this. However, over the past 20 years or so it meant that you couldn't make very good CMOS logic and had to use NMOS-only architectures. This and the poor scaling have kept the III-Vs away from large-scale integrated logic chips.

The III-V devices were used in RF circuits, but they were replaced by SiGe, and now many RF circuits use regular silicon processes. The III-Vs are still useful for optics.

The truth is that silicon has many problems that may prevent the industry from continuing to scale circuits to smaller geometries, and the available workarounds are generally painful. But the other options are worse.

Maybe in 10 years we will all be using cell phones that use carbon nanotubes... in our colonies on the moon.

owning your machine (5, Insightful)

Gothmolly (148874) | about 7 months ago | (#44914897)

An enthusiast wants to own his hardware; he doesn't care about 5.1 GHz uber-core machines. What the enthusiast wants is open specs, common interfaces, accessible GPIO, non-DRM memory or hardware, and open source code. Someone who buys the latest stuff from Intel and slaps Win 8.1 or Ubuntu on it so that they can run WoW is not an enthusiast; they're just a rich consumer.

Re:owning your machine (1, Insightful)

tepples (727027) | about 7 months ago | (#44915111)

What the enthusiast wants is open specs, common interfaces, accessible GPIO, non-DRM memory or hardware, and open source code.

Unfortunately, enthusiasts like you and me are in the minority. The fact that people buy locked-down video game consoles for ease of use [slashdot.org] is evidence that the majority don't care about owning their devices. It's unclear whether there are still enough enthusiasts to sustain a market for such owner-respecting computing devices.

Re:owning your machine (2)

Kjella (173770) | about 7 months ago | (#44916681)

Both those with vintage, restored, spit-polished classic cars and the ones with souped-up race-track cars are enthusiasts, just in completely different fashions. In your world only the tinkerers are "real" enthusiasts, and the people who want a car that can handle 150 mph well are just rich customers. Nobody but an enthusiast would ever start tweaking DRAM timings or the BCLK, or look at anything considered "exotic cooling", even if squeezing the last FPS out of a closed-source game with DRM, on a closed-source OS with DRM, on closed-spec hardware with DRM, isn't your kind of enthusiasm. The millionaires who simply buy the best computer money can buy are extremely few compared to all the hardcore overclockers and tweakers who really do care and invest time and effort into building the computer version of a dragster.

Non-enthusiasts don't care about much of anything anymore; they don't push the limits any more than soccer moms driving their kids to soccer practice. Maybe there was a time when the average computer user felt the difference, but it was a long time ago. Any modern computer is fine; it's like the car that's just supposed to get you from A to B. And if you start talking to them about a walled garden, they think of it more like only being able to drive on roads while you praise the virtues of an off-roader: sure it can go more places, but not any they know or care about. Most of them are very happy letting everything go through the cloud now, with easy backups and synchronization of everything. Not even the NSA revelations will win over convenience. If you've handed over the keys to all your data, you might as well hand over the keys to the computer too...

Sit down, Son (-1)

Ol Olsoc (1175323) | about 7 months ago | (#44914967)

I have some bad news for you.

Computers as we know them are going away. I'm not talking about the silliness that is spouted about how tablets or smartphones are going to kill the desktop. As long as people need screen real estate, there will be a need for desktops.

No, this is something deeper. That 386 screamer with a couple megs of RAM, a 40 MB hard drive, and a VGA card that you put together in high school just isn't going to happen. I still get nostalgic about typing in escape codes to print stuff, but those days are gone.

The field is becoming mature, and the point of assembling your own computer, and getting it to work is just not what it used to be. Perhaps some niches remain, like people wanting to assemble the most awesome game machine ever, and there are some great steampunked computers out there. But that isn't the spirit we had when we were getting this ball rolling.

Time to move on to something else if you want to be in a new field. It's not like computers have nowhere to go, but it's perhaps analogous to early Amateur Radio, where everyone rolled their own radios and antennas. Not many put together a tube radio to do AM on the 160 meter band any more. Now if you dare build your own, you are looking at surface-mount SDR radios, or re-creating stuff that has been around a long time.

Personally, I took the SDR radio route. But still, I can't build anything that I can't buy better.

Re:Sit down, Son (0)

Anonymous Coward | about 7 months ago | (#44915057)

I am ready for the "Hybrid PC", where there are EEPROM chips for user-level programs. All they have to do is add a couple to the mainboard, and the programs you use can write to them. When you run your programs, the system checks whether the program is in hardware; if so, it runs from there. For example, a program like Firefox would have an option to "write to system"; it would analyze your current hardware/software environment, prepare a matching image, then write it to the EEPROM, or tell you that your current PC doesn't support it, etc. Future program calls would run Firefox from the EEPROM. Obviously, some programs would benefit more than others. It would be like having a GPU for programs. Whaddya think?

ReadyBoost (1)

tepples (727027) | about 7 months ago | (#44915087)

The feature you describe (using flash EEPROM as a cache for the hard drive) has been around for years. Windows calls it "ReadyBoost". Or did I miss the joke?
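
For anyone curious what "flash in front of a slow disk" means mechanically, here's a toy read-through LRU cache in Python. It's only a sketch of the general idea, not how ReadyBoost is actually implemented:

    from collections import OrderedDict

    class ReadThroughCache:
        """Toy LRU read-through cache: fast 'flash' in front of a slow 'disk'."""

        def __init__(self, backing_read, capacity=4):
            self.backing_read = backing_read  # slow path, e.g. a disk read
            self.capacity = capacity          # blocks the 'flash' can hold
            self.blocks = OrderedDict()       # block_id -> data, LRU order

        def read(self, block_id):
            if block_id in self.blocks:           # hit: serve from 'flash'
                self.blocks.move_to_end(block_id)
                return self.blocks[block_id]
            data = self.backing_read(block_id)    # miss: go to 'disk'
            self.blocks[block_id] = data
            if len(self.blocks) > self.capacity:
                self.blocks.popitem(last=False)   # evict least recently used
            return data

    cache = ReadThroughCache(lambda b: f"<block {b} read slowly from disk>")
    print(cache.read(0))  # miss: hits the backing store
    print(cache.read(0))  # hit: served from cache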

Re:Sit down, Son (0)

Anonymous Coward | about 7 months ago | (#44915223)

There's a problem with your argument. Assembling a radio from circuitry and a computer from component boards have very different implications. We'll never see prefab computers that can match a thought-out custom build, because these companies are too fixated on maintaining business deals with each other to make it happen.

I would love to be able to call up a computer company, tell them what I'll use my PC for and how long I plan to make it last, and have them send something that matches those requirements. That's just not how the current business model works for computer companies; no, you get a choice of a few general-purpose machines that don't even excel in the area they are supposedly designed for.

Re:Sit down, Son (1)

Rockoon (1252108) | about 7 months ago | (#44916763)

The field is becoming mature, and the point of assembling your own computer, and getting it to work is just not what it used to be.

Yes it is. The only difference is that now the prefabbed computers are a lot closer in price (frequently cheaper!) to what it would cost you to build one yourself with equal components.

You still get to mix and match components that cannot be found in mainstream prefabbed computers, and in those cases you are still significantly better off money-wise building it yourself. As an example, any sort of silent-PC setup isn't mainstream, so you pay a significant premium having someone else build it for you.

enthusiast computing is getting smaller (2)

kawabago (551139) | about 7 months ago | (#44914987)

Arduino, Raspberry Pi, et al. In fact my next desktop may be a cluster of ten or more SOCs.

Re:enthusiast computing is getting smaller (0)

Anonymous Coward | about 7 months ago | (#44917253)

Check out the minnowboard [minnowboard.org] for something with a little more power.

Is this article a reprint? (1)

shess (31691) | about 7 months ago | (#44914989)

Gallium arsenide has been just about to replace silicon for 25 years, now. And Transputers were invented in the 80s. Sure, maybe it's finally time for these to hit the mass market, but one would be ill-advised to hold one's breath waiting for it.

Re:Is this article a reprint? (1)

marcosdumay (620877) | about 7 months ago | (#44915105)

If it is indeed time to change (and why change to GaAs? If you are changing, why not carbon?), you can expect that change to take 15 to 25 years, as no fab is prepared for it and no process works well on the next substrate yet.

One really shouldn't hold one's breath.

Re:Is this article a reprint? (1)

SuricouRaven (1897204) | about 7 months ago | (#44916563)

Fab issues. There are no economical-at-scale ways to manufacture graphene processors, even if certain engineering issues (the poor band gap) are solved. But GaAs is a well-established technology that has been around for decades - all it needs is a few incremental improvements, no need for revolutionary new science to support it.

Lack of a use case (3, Informative)

Animats (122034) | about 7 months ago | (#44915003)

From the article:

Programs like "Mail" or "Messages" could be implemented in reprogrammable silicon.

You need how much compute power to read mail?

Most users just don't need that much power. Once everybody could play streaming HDTV, the couch potato market was covered. Rendering in gaming could still improve, and NPC behavior could get smarter, but really, GTA V pretty much has that nailed and it runs on last-generation consoles.

There are people who need more power, but they're running fluid dynamics simulations or rendering movies or simulating new ICs or something like that. I've run Autodesk Inventor on 24-CPU workstations. That's one of the few interactive programs that can usefully use a 24-CPU workstation. It's not a mass market product.

The applications that need vast amounts of additional compute power are there, but they're not high-volume applications. Nor are they "enthusiast" applications. There's not enough volume there to justify heavy investment in faster CPUs.

This may change as we have better robots or something like that. But speeding up existing desktop apps, no. (Program load times are still ridiculously long, but mostly because of stupidity like phoning home for updates, waiting for the license server, fetching ads, or using virtual memory in a world where memory is cheap.)

Re:Lack of a use case (1)

citizenr (871508) | about 7 months ago | (#44915247)

There are people who need more power, but they're running fluid dynamics simulations or rendering movies or simulating new ICs or something like that. I've run Autodesk Inventor on 24-CPU workstations. That's one of the few interactive programs that can usefully use a 24-CPU workstation. It's not a mass market product.

In 1-2 years EVERY single new game will use 8 CPU cores by default.

Re:Lack of a use case (0)

Anonymous Coward | about 7 months ago | (#44915291)

There are people who need more power, but they're running fluid dynamics simulations or rendering movies or simulating new ICs or something like that. I've run Autodesk Inventor on 24-CPU workstations. That's one of the few interactive programs that can usefully use a 24-CPU workstation. It's not a mass market product.

In a 1-2 years EVERY single new game will use 8 cpu cores by default.

Doubt it. Most game developers have not even figured out how to use more than 2GB of main memory or more than one core. I can't even think of a game that currently uses four cores. The next gen consoles have four, and thus that will be the norm for PC games as well for the next six to nine years.

Re:Lack of a use case (0)

Anonymous Coward | about 7 months ago | (#44915617)

Uh, no. That's the point. They have 8 cores, which means games will use at least 8 cores by default, and likely have the ability to use more, as higher-threaded game engines are often (nearly) arbitrarily threaded.

Re:Lack of a use case (1)

WaffleMonster (969671) | about 7 months ago | (#44915957)

Doubt it. Most game developers have not even figured out how to use more than 2GB of main memory or more than one core.

Game developers don't give a fuck about the CPU anymore. It is all GPU where hundreds to thousands of "cores" are in play.

Re:Lack of a use case (1)

locopuyo (1433631) | about 7 months ago | (#44916383)

Doubt it. Most game developers have not even figured out how to use more than 2GB of main memory or more than one core.

Game developers don't give a fuck about the CPU anymore. It is all GPU where hundreds to thousands of "cores" are in play.

Yes they do, and no it isn't.
CPU cores are much faster than GPU cores, so for anything that can't be parallelized it is much faster to do the calculations on a CPU. There are no games that do the main physics and AI calculations on the GPU, because most of that stuff can't be parallelized enough.
The only time something will perform faster on the GPU is when it can be parallelized into hundreds or thousands of calculations.

Re:Lack of a use case (1)

SuricouRaven (1897204) | about 7 months ago | (#44916587)

There are a few games that use the GPU for physics. Collision detection can be parallelized nicely. Not many though, simply because few games would see any benefit from it: you rarely have more than a handful of moving objects at a time, easily few enough for the CPU to handle alone.

I wrote a mod for ut2k4 that uses the CPU to calculate volumetric explosion simulations - due to the limited CPU time available it has to use a very crude model, but it's still better than the standard line-of-sight approach games use, where you can hide from a nuclear bomb behind a well-placed lamppost. If games are to use more processing power for physics, that might be where it ends up: simulating shrapnel from explosions and calculating pressure waves to more precisely calculate damage.

The mod really changes how grenades work in the game. They are of very limited effect in open space, but in a confined room or corridor the effective blast range is much longer. It'll even travel out windows and around corners. I chose to simulate the 'hollywood fireball' rather than a physically accurate explosion, so it looks quite impressive. It basically runs a near-isotropic flood fill in three dimensions until a specified volume is reached, then spawns lots of conventional explosions within that volume.
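
For the curious, here's a minimal Python sketch of that kind of volume-limited, near-isotropic flood fill on a voxel grid. It's my own illustration of the technique, not the actual UT2004 mod code:

    from collections import deque

    def blast_volume(grid, start, max_cells):
        """Expand from the detonation point through open cells (value 0),
        breadth-first, until max_cells are filled. Returns reached cells.
        Illustrative sketch, not the actual UT2004 mod code."""
        sx, sy, sz = len(grid), len(grid[0]), len(grid[0][0])
        filled = {start}
        frontier = deque([start])
        steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        while frontier and len(filled) < max_cells:
            x, y, z = frontier.popleft()
            for dx, dy, dz in steps:
                nx, ny, nz = x + dx, y + dy, z + dz
                if ((nx, ny, nz) not in filled
                        and 0 <= nx < sx and 0 <= ny < sy and 0 <= nz < sz
                        and grid[nx][ny][nz] == 0):
                    filled.add((nx, ny, nz))
                    frontier.append((nx, ny, nz))
        return filled

    # A 1x1x10 'corridor': the same blast volume reaches much further down
    # a confined passage than it would spread in open space.
    corridor = [[[0] * 10]]
    print(sorted(c[2] for c in blast_volume(corridor, (0, 0, 0), 8)))

Breadth-first expansion from a queue is what makes the fill near-isotropic in open space, and capping the filled-cell count is the "specified volume".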

Re:Lack of a use case (1)

K. S. Kyosuke (729550) | about 7 months ago | (#44917231)

There are no games that do the main physics and AI calculations on the GPU because most of that stuff can't be parallelized enough.

Since when can't physics and AI be parallelized? Have you looked around recently? The physics around you is heavily parallel, and so is your brain. (And quite a lot of computer-run AI stuff as well.)

Re:Lack of a use case (1)

ByronHope (2669333) | about 7 months ago | (#44916075)

Doubt it. Most game developers have not even figured out how to use more than 2GB of main memory or more than one core. I can't even think of a game that currently uses four cores. The next gen consoles have four, and thus that will be the norm for PC games as well for the next six to nine years.

The Total War series has been using four cores for a number of years, and I'm sure it's not the only game developed for the PC that does so.

Re:Lack of a use case (1)

Rockoon (1252108) | about 7 months ago | (#44916831)

The gaming market is only rudimentarily separating the workload into X number of threads -- anything else is a complexity nightmare for them -- sure, many are now separating the physics stuff into one thread and the A.I. stuff into another, but they are not breaking the A.I. up into multiple threads, nor the physics.

Instead they are relying on the middleware frameworks' ability to be more granular without their intervention, and the middleware just isn't designed for a specific number of cores. For example, PhysX doesn't care how many shaders your GPU has (or that you even have a GPU!) -- it just uses everything it can find -- the end metric isn't the number of cores available... it's the number of GFLOPS and the amount of bandwidth available.

...and now that things like OpenCL are near-universal (both AMD and Intel have multi-core SSE/AVX CPU drivers, and AMD, Intel, and nVidia all have GPU drivers), the entire concept of threading is going to go the way of the dodo. The specifics are abstracted away, so all that is left is a data-parallelism requirement on the design end and a GFLOPS-and-bandwidth requirement on the execution end.
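
To make the "specify data parallelism, not thread counts" point concrete, here's the standard PyOpenCL vector-add sketch (assumes the pyopencl package and a working OpenCL driver). Note the host code never says how many cores or threads to use; it enqueues n work-items and the runtime maps them onto whatever CPU or GPU the driver exposes:

    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # CPU or GPU, whatever is installed
    queue = cl.CommandQueue(ctx)

    n = 1_000_000
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)

    mf = cl.mem_flags
    a_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    c_g = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *c) {
        int gid = get_global_id(0);  /* which element this work-item owns */
        c[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.vadd(queue, a.shape, None, a_g, b_g, c_g)  # n work-items, no thread count

    c = np.empty_like(a)
    cl.enqueue_copy(queue, c, c_g)
    assert np.allclose(c, a + b)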

Re:Lack of a use case (1)

grep -v '.*' * (780312) | about 7 months ago | (#44916193)

GTA V pretty much has that nailed and it runs on last-generation consoles.

Yeah, those old last generation consoles are just so ... ehhh, yesterday.

No -- wait! That really was yesterday, wasn't it?

(OK, so I've botched the [techradar.com] dates [techradar.com]. But it's funnier this way and besides you won't be reading this article in a month.)

A bit off base, IMO (2)

Just Brew It! (636086) | about 7 months ago | (#44915079)

We're hitting a wall on single-threaded performance due to clock speed limitations, but CPU cores keep getting smaller and more power-efficient. In a few years we'll have the ability to put 32 or more cores in consumer CPUs, and it wouldn't surprise me if we have 8-core CPUs in smartphones and tablets. The key to continued performance improvements is better multi-threaded code, to allow us to effectively split the workload across more cores.
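
As a toy illustration of what "split the workload across cores" looks like, here's a Python multiprocessing sketch of my own (not from any article): the data is chunked, one worker per core, and the partial results are combined:

    import os
    from multiprocessing import Pool

    def partial_sum(bounds):
        """CPU-bound work on one slice of the range."""
        lo, hi = bounds
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        n = 10_000_000
        workers = os.cpu_count() or 1
        step = n // workers
        chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                  for i in range(workers)]
        with Pool(workers) as pool:   # one process per core
            total = sum(pool.map(partial_sum, chunks))
        print(f"{workers} workers, sum of squares below {n}: {total}")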

Re:A bit off base, IMO (1)

Anonymous Coward | about 7 months ago | (#44915239)

The latest Exynos is an eight-core ARM processor: http://www.engadget.com/2013/09/11/exynos-5-octa-demos-8-cores-working-at-once-gpu-assist-and-ener/.

Looks like we're almost there already.

Re:A bit off base, IMO (1)

aiadot (3055455) | about 7 months ago | (#44915937)

There is a night-and-day difference in size between a desktop-class x86 core and a smartphone-class ARM core. Desktop CPUs with 32 full-featured cores are still quite far away, in my opinion. And to be honest, I'm not even sure there is any advantage to that. For at least the next decade I expect APU-like processors with some big general-purpose cores and tons of smaller task-optimized cores (GPU cores, for example). Even the Exynos you posted is not a true 8-core ARM A15 processor but a mix of A15 and A7 cores.

ArLinux + FPGA is the perfect combo for enthusiasts (0)

Anonymous Coward | about 7 months ago | (#44915207)

I'm posting using Ubuntu running on a ZedBoard (zedboard.com)

wtf is an enthusiast? (0)

Anonymous Coward | about 7 months ago | (#44915325)

do you people really spend alot of money and time to build a single socket machine just because?

what a giant waste of time. you can build a really fat machine with enough money. you can also
program a novel scalable system on a $200 machine.

what goal does an enthusiast have?

Re:wtf is an enthusiast? (0)

Anonymous Coward | about 7 months ago | (#44915499)

To learn how to spell "a lot".

Latency. (1)

eriks (31863) | about 7 months ago | (#44915451)

As an "enthusiast", for me, it's almost all about latency. I want a system that responds as close to instantaneously as possible, especially for the stuff that really should be nearly instantaneous on modern hardware. These days, that means plenty of ram and a fast storage subsystem: SSD is the best upgrade I've done in years. I wait less. A 2 hour render is still a 2 hour render, but when I start up a heavy application I only wait 3 seconds instead of 10, or even 20. It just makes everything less frustrating, even 1 and 2 second waits can be really annoying if they happen a lot.

Many things are much better than they used to be, but I still say "hurry up" to my system too often, especially using a GUI. Though, my 3-year-old built-from-parts "enthusiast" machine feels faster to me than many newer commodity machines with better specs. "Tuning" things on the software side can make a difference, which is something that "enthusiasts" do, and want to be *able* to do.

So long as there are systems that can be tuned, streamlined and knocked about for fun, enthusiasts will be happy. Though I'm still searching for the "holy grail" of a GUI that never stutters, stalls or hiccups. Mostly, if you want that, you still have to use a command line.

Though I guess, if we ever get such a "holy grail" I may cease to be an enthusiast, since computing perfection will be a commodity.
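
One way to put a number on those stutters is to watch how late a supposedly periodic tick actually fires; anything beyond the interval is time the system stole from you. A quick sketch, not a rigorous benchmark:

    import time

    INTERVAL = 0.010  # target a 10 ms tick, roughly one frame at 100 Hz

    worst = 0.0
    next_tick = time.perf_counter() + INTERVAL
    for _ in range(1000):
        time.sleep(max(0.0, next_tick - time.perf_counter()))
        now = time.perf_counter()
        worst = max(worst, now - next_tick)  # how far past the deadline we woke
        next_tick = now + INTERVAL

    print(f"worst tick lateness over ~10 s: {worst * 1000:.2f} ms")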

Re:Latency. (1)

Anonymous Coward | about 7 months ago | (#44915645)

> Many things are much better than they used to be, but I still say "hurry up" to my system too often, especially using a GUI.

Throwing "Gnome" the hell out, and using a much older, less complex manager like "twm" or "vtwm" if you need multiple windows, saves roughly an hour a day when I'm working with Linux and a GUI.

Re:Latency. (3)

eriks (31863) | about 7 months ago | (#44915711)

Agreed. I usually go for XFCE on Linux, it's usually pretty snappy, though a snappy WM doesn't help with crufty chunky applications.

Enthusiast: there isn't a ring to rule them all (1)

Carlos Felipe Forigua Rodriguez (2978311) | about 7 months ago | (#44915459)

Let's face it: the uber-duper, turbine-sounding high-end desktop doesn't get much use unless you have some kind of time-management disorder or addiction. If you work or study, you can't spend much time on your precious anyway. If you work, you get a console: turn it on, play, and don't worry about the price of games, since you don't have much time anyway. If you run CFD simulations or the like, it's your employer's problem to get you the tools you need; in my experience you just use another computer to do the heavy lifting while you handle the rest of your work on a now-retired-from-heavy-lifting workstation. If you like to program, you can do it from any computer. If you're an Open Source fan, your favorite software will run flawlessly on a 10-year-old PC.

New generations prefer portability and use cellphones or tablets. They may never see an opened-up desktop getting upgrades the way we did. The big companies also prefer it this way, so they can sell more.

In my case, my next two upgrades will be a new impact- and water-resistant low-end smartphone and a Raspberry Pi-like computer. The cellphone will be my calendar, mail-chat-Google-Internet machine, and casual entertainment system (music, TV series, casual games). The Raspberry Pi will be used as a server for a personal cloud and other home-automation tasks.

Maybe if I get a decent job I will think about going back to gaming, though I'm torn between PC and console on convenience grounds. PC: very useful, but its lifespan is shortened by use, dropping prices, and new launches. Console: long lifespan, but not very useful, with pricey games. As a final thought, the next-gen consoles are about as powerful as any mid-range desktop, unlike in 2006 with the PS3 and Xbox 360, so even in the console niche there isn't much room for high-end consumer computing. But in my current state of life a decent job is only a dream (I'm an MSc student and my country doesn't need my skills).

Another former good website trashed by Ziff Davis (1)

Vskye (9079) | about 7 months ago | (#44915611)

Really, I used to enjoy this website, and I've watched it fall into the crap it is now since Ziff Davis acquired it.

Please never, ever do this, Slashdot. Oh...

From one PC Enthusiast.... (1)

Anonymous Coward | about 7 months ago | (#44915827)

To another: I can quite happily confirm that the enthusiast market is not dead. Quite the opposite: it is thriving. The latest generation of games consoles has created a surge of interest from people who want to upgrade existing systems or purchase new ones.

I build plenty of PC gaming machines, and my build queue is full for the next 18 months. I've even had people interested in getting the best possible sound out of their machines for their HTPC setups. No, the enthusiast market is definitely alive and healthy; it is simply being overshadowed by market analysts who are only interested in mobile devices, since that is where they perceive the growth to be. However, once that market becomes saturated (and it is well on the way), it will be up to traditional computing devices to save the market once again.

Tiny ARM ASIC Chips (1)

ilikenwf (1139495) | about 7 months ago | (#44915927)

Such as the ones used in the WiFi SD cards from Transcend and PQI.

Imagine a bunch of tiny, cheap Linux boxes acting as meshes, dead drops, micro servers, etc. And imagine how long they'd run on a battery, or even a battery plus solar! (A sketch of the dead-drop idea follows the links below.)

https://forum.openwrt.org/viewtopic.php?id=45820 [openwrt.org]
http://www.keyasic.com/keyasic_sub.php?type=information&inid=24 [keyasic.com]
http://hackaday.com/2013/09/19/advanced-transcend-wifi-sd-hacking-custom-kernels-x-and-firefox/ [hackaday.com]
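
The dead-drop idea needs surprisingly little code, which is why these tiny boards are enough. A minimal sketch of a TCP micro server that appends whatever clients send to a file; the port and SD-card path are placeholders, and real PirateBox-style projects layer far more on top.

/* Minimal sketch of a "dead drop": accept a connection, append the
 * payload to a file.  Port 9000 and /mnt/sd/drop.txt are placeholders. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

int main(void)
{
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = { 0 };
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(9000);             /* assumed port */

    if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0 ||
        listen(srv, 4) < 0) {
        perror("bind/listen");
        return 1;
    }

    for (;;) {
        int c = accept(srv, NULL, NULL);
        if (c < 0) continue;
        FILE *drop = fopen("/mnt/sd/drop.txt", "a");  /* assumed path */
        char buf[1024];
        ssize_t n;
        while ((n = read(c, buf, sizeof buf)) > 0)
            if (drop) fwrite(buf, 1, (size_t)n, drop);
        if (drop) fclose(drop);
        close(c);
    }
}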

Re:Tiny ARM Asic Chips (1)

SuricouRaven (1897204) | about 7 months ago | (#44916595)

Look up PirateBox. There are a bunch of people working on exactly that, with limited success: the tech is just about there, but the numbers aren't. You can't build a mesh without more people coming together locally.

no tablet could serve as my DAW (2)

jsepeta (412566) | about 7 months ago | (#44916285)

My digital audio workstation runs Logic Pro X, Pro Tools 11, and Cubase 6.5. No tablet or phone can replace the desktop, which has not only several hard disks and lots of RAM, but an operating system capable of running plugins from a variety of third-party sources. I'm in no position to junk this thing for whatever happens to be "hot" in the next couple of years, because I enjoy working with older versions of software that are no longer supported. iOS comes close to OS X and Windows 7 for basic audio and MIDI recording, but the musical-instrument industry still hasn't completely cracked the nut on integrating hardware and software instruments into a comfortable recording, mixing, and mastering workflow. Enthusiasts like me will still need enthusiast computer hardware for the foreseeable future.
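
The workload explains why: a DAW has to run every plugin in the chain inside each audio buffer's time budget. A minimal sketch of that per-buffer callback, using a trivial gain "plugin" as a stand-in; the sample rate and buffer size are typical values, not taken from any specific host.

/* Minimal sketch: the per-buffer processing callback a plugin host
 * would invoke.  SAMPLE_RATE and FRAMES are typical, assumed values. */
#include <stdio.h>

#define SAMPLE_RATE 44100.0
#define FRAMES      128

/* the simplest possible "plugin": apply a gain to the buffer */
static void process(float *buf, int frames, float gain)
{
    for (int i = 0; i < frames; i++)
        buf[i] *= gain;
}

int main(void)
{
    float buf[FRAMES] = { 0 };
    process(buf, FRAMES, 0.5f);
    printf("per-buffer budget: %.2f ms\n", FRAMES / SAMPLE_RATE * 1000.0);
    return 0;
}

At 44.1 kHz and 128 frames, the entire plugin chain gets under 3 ms per buffer, which is why DAW users keep buying big cores and fast storage.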

Necessity is the mother of invention... (1)

TheSeatOfMyPants (2645007) | about 7 months ago | (#44916693)

Real enthusiasts have always been the ones that wanted to really work with the hardware, whether the object was a car engine, vacuum-tube TV, or a computer. Fewer kids/adults developed the interest after the rise of "disposable" consumer culture, but from what I've been reading, that trend has slowly reversed as the weak economy started pushing more and more people to fix or improve whatever they can rather than blowing a bunch of cash on a replacement.

Personally, I've learned thus far that working with a soldering iron puts me into a great, relaxed 'zen' state, and learning about PCBs and successful practice with the iron are both highly rewarding. If I succeed at fixing what I have on hand, I'll try to learn enough about electronics and programming to build kits, learn to modify them, and generally see how far I can go... As our economy continues to stagnate, more people and kids will find themselves at some point with a "broken" tech item that costs too much to replace at the drop of a hat; at least some will take the same route I have (particularly if they know someone else who already succeeded) and become enthusiasts as well.

The formula is very, very easy. (0)

Anonymous Coward | about 7 months ago | (#44917755)

interesting area requiring lots of processing power * computers w/ lots of processing power = profit!!!

Breaking the equation down, you find that any field that both requires lots of processing power and is interesting these days is becoming rare. But what you could do is pull an Apple: take something already ubiquitous, brand it, and make it social and thereby cool. Then just add scads of eye candy to it and you now require tons of processing power. GENIUS!!! (As if the eyePhone isn't already a supercomputer performing lots of GUI requests.) Or sex. Or money... Interestingly, Bitcoin mining, even if it's an outdated notion, would get attention if you could somehow turn it into a game. Or perhaps visualize hashing? Man, if you could combine sex and coin mining you'd get people buying new PCs all day long.
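
Since the comment asks about visualizing hashing: the mining loop really is just repeated hashing. A minimal sketch of Bitcoin-style double SHA-256 over a dummy 80-byte header, using OpenSSL's classic one-shot SHA256() helper (build with -lcrypto); note that real mining compares the digest against a 256-bit target, so counting leading zero bytes here is only a simplification.

/* Minimal sketch: double SHA-256 over a dummy block header.
 * The header bytes are made up; build with: gcc sketch.c -lcrypto */
#include <openssl/sha.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char header[80];
    memset(header, 0xAB, sizeof header);   /* dummy header contents */

    unsigned char h1[SHA256_DIGEST_LENGTH], h2[SHA256_DIGEST_LENGTH];
    SHA256(header, sizeof header, h1);     /* first hash pass  */
    SHA256(h1, sizeof h1, h2);             /* second hash pass */

    int zeros = 0;
    while (zeros < SHA256_DIGEST_LENGTH && h2[zeros] == 0)
        zeros++;

    for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
        printf("%02x", h2[i]);
    printf("\n%d leading zero bytes\n", zeros);
    return 0;
}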
