Support for Intel HD3000+ graphics and future integrated video devices


#1

I feel High Fidelity should stress the importance of running seamlessly on Intel 3000-series graphics or higher. A lot of talented people cannot afford, or do not use, the latest computer technology, and laptops are currently the majority platform in use. The Intel HD integrated series should have a specific graphics mode available to it, and this mode should support future systems with the most basic capability. Does anyone else agree with me on this?


#2

These are strictly my own opinions. I have nothing to do with anything official; I've just been here a while and maybe have a better feel for the big picture.

I think the problem with this is that High Fidelity is designing not for now but for the future: a future where something with the performance of my Alienware laptop with GTX 680M graphics is not uncommon among users.

While I feel the pain of what this implies, the apparent goals of HiFi do not lend themselves well to what is currently common but low-end technology. I make a point of switching back to Intel HD4000 mode on my laptop from time to time and, in a simplistic build, it manages to do OK, but it won't be able to keep up much longer as builds get bigger and more detailed and HiFi adds additional GPU-intensive effects to the rendering engine.

While there may be some hope for “lesser” hardware in the future using distributed computations and remote render processing, the cold hard fact is, HiFi is building a next generation graphics experience, not a current or last generation one.


#3

The sad part of this is that you lose some creativity from individuals whose primary line of work is not graphic design. It will be hard to interact with people trying to run HiFi on a less-than-average system if they are not given some means of guaranteed accommodation. A good example is a writer, whose only requirement is a laptop that produces good documents; this is why I feel good support for integrated video should not be ignored. For HiFi to be a success it will have to support the future of computer technology, which may favor mobile computing and ULV technologies with minimal graphics capability. I don't know how feasible it would be with HiFi, but offloading graphics processing to gaming servers for these users and streaming their interface over Wi-Fi, at the cost of some latency, may be a solution.


#4

One thing to keep in mind is that there's no good feel for how far away the launch of HiFi as a production system is. If it were 2015/2016 I'd be more concerned about its need for more advanced computing power, whereas if it's more like 2020 I'd be less so. Moore's Law has always been our friend and will continue to be for at least a few more years.


#5

Wise words as always Omega.

People in general don't seem to realise this is a development platform and, as such, is built for the top technology available now, which in a couple of years will be embedded even in your phone.

As Alphas it’s our job to look forward with HF, not back.


#6

I agree; however, there are going to be a lot of younger people with really great ideas who are unable to participate in the alpha/beta. I just read that Samsung is considering buying AMD; that would definitely improve mobile graphics for their portable devices.


#7

I look at this the way I do Second Life for graphics: Intel graphics will always be iffy, AMD graphics will be glitchy, and Nvidia will just generally work pretty well.

It is that way with almost any game or 3D environment I see lately, not just Second Life. It's why I hate seeing Intel graphics loaded into so many machines that people buy expecting them to be acceptable for 3D content. Integrated video shares your system memory and in general does not support what it should.


#8

Same problem with Open Wonderland. One guy outfitted his school with laptops for OWL clients, and they were all Intel-based graphics. No go, too bad, so sorry. They wouldn't work, so he's stuck with a room full of computers fit to play “Hunt the Wumpus” on. I agree with Coal that this IS Alpha, and underpowered laptops just are not up to the requirements of a live 3D virtual environment that is in a severe initial state of development. The devs are far less concerned with “creativity” at this stage than they are with getting it to work at all. A couple of gigs of RAM just will not cut it, either.


#9

If the goal is to create a ‘metaverse for all,’ then ALL should be able to use it.

Otherwise, it is a playground for geeks. While I fancy myself as having a permanent seat at the geek table, the beauty of most things is in their diversity.

I’m willing to bet that at some point there will be better compatibility, but these are my opinions.


#10

I speak personally here, not for High Fidelity etc, but here is how I look at it… Providing an experience for all has never meant every human or device could experience it.

It comes to a point where we have to draw the line to make progress. You can have a piece of paper that is blank or one with lines; you can have college ruled, etc. It never ends. One piece of paper is just not meant for every use. Take the same piece of paper or notebook to every class in college, and the professor in one or more of those classes will say, “I am sorry, we just have different requirements here.”

Well, for High Fidelity to progress there has to be a point where the professor says “We need you to have this certain kind of computer/graphics card to access this system.”

High Fidelity hired men who used to work for Nvidia and Pixar so we could have beautiful graphics in the pipeline, some of the stuff we have not even dreamed of yet.

I myself am ready for the dream to come alive more and more, and that is something that is inevitable. With innovation comes obsolescence.


#11

It’s not for all if all can’t experience. That IS what all means, lol.

But I do understand and respect your point. I am not expecting backwards compatibility here, but in order for this thing to grow, expand, and blossom the way we all hope, it has to at least accommodate ‘non-geek’ computers. Aside from the whole financial aspect, not everyone will be keen on purchasing an overkill PC to play a game; they didn't with SL, and they won't here.

Intel graphics are standard even in high-end systems (and are annoyingly a fixture in Macs, which are a standard in my industry)… if they are not supported, are we expected to just say, “whoever doesn't have or can't afford an Nvidia card should go play checkers”?

Sounds a bit elitist.


#12

The long term aim is for the platform to be available to mobile users, so I do agree somewhat with your point. However in this Alpha stage they aren’t there yet, so I do sympathise with the HiFi team in terms of what they can support, in Alpha.


#13

I don't speak for HiFi or know anything about what their plans are, but I suspect that the bugs with Intel graphics might be a temporary thing. It used to work… not well, but you could use it (without shadows, etc.).

As far as building HiFi as a platform goes, any company would be foolish to design their product for what is the norm at the time. A year from now things will be different; later, even more so.

I am not aware of any high-end systems that use Intel graphics, unless you are talking about price rather than power and function.

ATI has not had a good record of updating their drivers, which is why I stopped using them years ago. And, as far as I know, Apple is still not the most common system that people use. “Why won't you support my Apple IIe?” and, later, “my Win 95 machine?” used to be common complaints.

But, like I said, as far as I know, they are trying to support everything that is common now. And maybe even tablet and mobile versions.


#14

I am curious, what are people using for systems? I am sure there is a wide range present. What about those of you using the 3D cameras, Oculus and sensor inputs? What is working with what well right now?

Thanks!


#15

While it is in the Alpha stage, it is more a playground for “geeks” than not. I think that when HiFi hits the beta stage, or better, it will accommodate less critical graphics requirements. Or you will have better gear by then. One thing is for sure: if you do not have accelerated graphics, a rig suitable for gaming, and a goodly amount of RAM, you are going to have problems with anything 3D. Me, I am bandwidth challenged, trying to make a HughesNet satellite connection work. I get video but lousy, if any, audio. I live with that constraint. I hope it gets better in the future too, but I cannot rightfully say that HiFi has a problem and not me.


#16

Hi all,
While helping out on a bug, we need comments from Intel HD3000+ users.

Has anyone successfully launched Interface.exe with Intel HD 3000 graphics or similar?

Here is the bug in particular.
https://worklist.net/20430
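For anyone posting results on that worklist item, it helps to include the exact GPU model string along with whether Interface.exe launched. Here is a small best-effort sketch of my own (nothing from the HiFi codebase; it just shells out to the standard per-OS tools) that grabs the graphics adapter description to paste into a report:

```python
import platform
import subprocess

def gpu_description():
    """Best-effort lookup of the graphics adapter name.

    Returns the tool's output as a string, or None if the
    relevant command is unavailable or fails.
    """
    system = platform.system()
    try:
        if system == "Windows":
            # Lists video controller names, e.g. "Intel(R) HD Graphics 3000".
            out = subprocess.check_output(
                ["wmic", "path", "win32_VideoController", "get", "name"],
                text=True)
        elif system == "Linux":
            # Keep only the display-adapter lines from the PCI listing.
            out = subprocess.check_output(["lspci"], text=True)
            out = "\n".join(line for line in out.splitlines()
                            if "VGA" in line or "3D" in line)
        elif system == "Darwin":
            out = subprocess.check_output(
                ["system_profiler", "SPDisplaysDataType"], text=True)
        else:
            return None
        return out.strip() or None
    except (OSError, subprocess.CalledProcessError):
        return None

if __name__ == "__main__":
    print(gpu_description())
```

It degrades gracefully: if the command isn't present, it just returns None rather than crashing, so it's safe to run on any of the three platforms.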

In my opinion, if High Fidelity is going to be popular, support for a common setup like Intel HD is a must.


#17

I am working on a basic i3 laptop with HD 3000 graphics and 6 GB of memory; let me run some tests and see how it works out.


#18

Realistically, Intel video chips are not suitable for more than basic desktop usage, and even a web browser pushes the limits of their capabilities. Even the modern ones can only cope with a five-year-old game, because they offload most of the work to the CPU.
Expecting more than that is like buying a wheelbarrow and trying to ride it on the motorway.

It's just the nature of the beast: if you want to run any modern game, then lowest-common-denominator components are not an option. A suitably powerful machine is still considerably cheaper than, say, a games console. Last year I picked up an 11" laptop with a GeForce GT 335M for about £200. These machines do exist if you hunt them down.

Totally not being elitist, but for a long time I've considered anything sold with integrated video, without it being made explicitly clear how low-end it really is, a massive con. Especially back when they were labelled ‘Intel Extreme’. It's misleading marketing, and I am surprised that level of deception is even legal in any country.

I just wish there was a straightforward way of educating people in what not to buy, so they can vote with their feet.

Getting back on topic.

@Bill.Hifi, I gave it a try on a machine with an HD 4500 and a ULV i5 @ 2 GHz running Mint 17. I had no stability issues, but performance was terrible. I tried it with the same mesh in SL for comparison, and it does perform comparatively better for me in that scenario.

My issues are also with satellite-Internet-related latency. But it's infinitely better than my experience with SL, where I had to patch in all sorts of hacks to make it usable.

On my workstation with GeForce 760s I actually appear to be CPU-bound: I peg my CPU at ~22 fps while the GPU is mostly idle. I tried pulling in a ship from Star Citizen to see what would happen, just because it's the most immediate thing with an insane poly count I had to hand… The renderer managed it fine with no drop in my fps.

The renderer in its current state doesn't seem to be that demanding GPU-wise. I'd expect it to get better on the CPU side as things get less experimental and more optimized; that might, in the long run, help offset the pain of trying to function without a 3D chip. But I'd also expect the system capabilities of the content creators themselves to play a part in shaping what trends develop in scene complexity and poly counts. There seem to be far too many variables to really predict much of anything at this point.

If a majority of content creators are using low-end kit, then they're naturally going to sway towards producing less resource-intensive content. I guess what actually takes shape may be more representative of what the majority are running than any forum discussion can discern.


#19

I still remember the SL 2003/04 prim-hair flame wars that would erupt between those with systems that could handle it and those without.

I just hope we don't see a new iteration of that drama while trends stabilise.


#20

Good topic, and one that adds to the questionability of opening the alpha at this stage: there must be an influx of people on systems that aren't really suitable for even this early stage of development, and there may be insufficient management of expectations.

It must be pretty clear that alpha development must target systems that can be expected to be common at least, say, two years from now. I jumped in on an iMac with 4 GB and an Intel Core Duo, but I also have a beefier Windows/Linux box. Targeting mobile as a baseline would be smart, IMO.

Looking forward, maybe way forward from the immediate now, it would be good to have a range of possible applications of HF in mind as development proceeds. There will be trade-offs, as always, on system capabilities and software capabilities. Maybe some graceful degradation can be built in from now, so that a wider range of applications remains possible on a wider range of hardware.