Sunday, October 14, 2007

Do You Want To Have Your Own Universe?

If you want to broadcast your messages to the whole world, you can become a contributor to My PC Universe (or any other blog in MyUniverseRing). We are waiting for your contribution.

Contact us: my universe ring email

 


Thursday, October 11, 2007

Physics Nobel Prize to Hard Drive Technology Pioneers

The Royal Swedish Academy of Sciences has awarded the prize to the two men who discovered the giant magnetoresistance (GMR) effect in 1988. Albert Fert of France and Peter Grünberg of Germany discovered GMR independently of each other.

This breakthrough made it possible to build hard drives with far greater data density on their platters. Although Fert and Grünberg discovered GMR in 1988, no hard drives based on the effect appeared until 1997.

GMR - Giant Magnetoresistance effect graph

The GMR effect was discovered thanks to new techniques developed during the 1970s to produce very thin layers of different materials. If GMR is to work, structures consisting of layers that are only a few atoms thick have to be produced. For this reason GMR can also be considered one of the first real applications of nanotechnology.
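The size of the effect is usually quoted as a magnetoresistance ratio. A minimal sketch of that figure of merit, using purely illustrative resistance values (not measured data from Fert's or Grünberg's experiments):

```python
# Giant magnetoresistance (GMR) figure of merit: the relative change in
# resistance between antiparallel and parallel magnetization of the layers.
def gmr_ratio(r_parallel: float, r_antiparallel: float) -> float:
    return (r_antiparallel - r_parallel) / r_parallel

# Hypothetical multilayer: resistance drops when an external field (e.g. a
# magnetized region of a disk platter) aligns the layers in parallel.
r_p = 10.0   # ohms, parallel alignment (low resistance) -- illustrative
r_ap = 18.0  # ohms, antiparallel alignment (high resistance) -- illustrative

print(f"GMR ratio: {gmr_ratio(r_p, r_ap):.0%}")  # GMR ratio: 80%
```

A read head exploits exactly this: the magnetization of a passing bit flips the layer alignment, and the large resistance swing makes even very small, densely packed bits detectable.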

Thursday, March 22, 2007

Raytheon Develops World's First Polymorphic Computer

EL SEGUNDO, Calif., March 20, 2007 -- The world's first computers whose architecture can adopt different forms depending on their application have been developed by Raytheon Company (NYSE: RTN).

The architecture of the MONARCH processor with key elements identified

Dubbed MONARCH (Morphable Networked Micro-Architecture) and developed to address the large data volume of sensor systems as well as their signal and data processing throughput requirements, it is the most adaptable processor ever built for the Department of Defense, reducing the number of processor types required. It performs as a single system on a chip, resulting in a significant reduction of the number of processors required for computing systems, and it performs in an array of chips for teraflop throughput.

"Typically, a chip is optimally designed either for front-end signal processing or back-end control and data processing," explained Nick Uros, vice president for the Advanced Concepts and Technology group of Raytheon Space and Airborne Systems. "The MONARCH micro-architecture is unique in its ability to reconfigure itself to optimize processing on the fly. MONARCH provides exceptional compute capacity and highly flexible data bandwidth capability with beyond state-of-the-art power efficiency, and it's fully programmable."

In addition to the ability to adapt its architecture for a particular objective, the MONARCH computer is also believed to be the most power-efficient processor available.

"In laboratory testing MONARCH outperformed the Intel quad-core Xeon chip by a factor of 10," said Michael Vahey, the principal investigator for the company's MONARCH technology.

MONARCH's polymorphic capability and super efficiency enable the development of DoD systems that need very small size, low power, and in some cases radiation tolerance for such purposes as global positioning systems, airborne and space radar and video processing systems.

The company has begun tests on prototypes of the polymorphic MONARCH processors to verify they'll function as designed and to establish their maximum throughput and power efficiency. MONARCH, containing six microprocessors and a highly interconnected reconfigurable computing array, provides 64 gigaflops (floating point operations per second) with more than 60 gigabytes per second of memory bandwidth and more than 43 gigabytes per second of off-chip data bandwidth.
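A quick back-of-envelope check of the release's array claim: at 64 gigaflops per chip, roughly 16 MONARCH chips would be needed to reach a teraflop (the 1-teraflop target below is just an illustrative round number):

```python
import math

gflops_per_chip = 64.0   # per-chip throughput quoted in the release
target_gflops = 1000.0   # one teraflop, the array-scale figure mentioned

# Smallest whole number of chips reaching the target
chips_needed = math.ceil(target_gflops / gflops_per_chip)
print(chips_needed)  # 16
```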

The MONARCH processor was developed under a Defense Advanced Research Projects Agency (DARPA) polymorphous computing architecture contract from the U.S. Air Force Research Laboratory. Raytheon Space and Airborne Systems led an industry team with the Information Sciences Institute of the University of Southern California to create the integrated large-scale system on a chip with a suite of software development tools for programs of high value to the Department of Defense and commercial applications. Besides USC, major subcontractors included Georgia Institute of Technology, Mercury Computer Systems and IBM's Global Engineering Solutions division.

Raytheon Space and Airborne Systems is the leading provider of sensor systems giving military forces the most accurate and timely information available for the network-centric battlefield. With 2006 revenues of $4.3 billion and 12,000 employees, SAS is headquartered in El Segundo, Calif. Additional facilities are in Goleta, Calif.; Forest, Miss.; Dallas, McKinney and Plano, Texas; and several international locations.

Raytheon Company, with 2006 sales of $20.3 billion, is an industry leader in defense and government electronics, space, information technology, technical services, and business and special mission aircraft. With headquarters in Waltham, Mass., Raytheon employs 80,000 people worldwide.

(c) www.shoutwire.com

Saturday, March 17, 2007

Google to Develop a Short Term Memory

by Marcus Yam

Google to make search logs anonymous after 18 to 24 months


Google is changing its policies on storing information about its users. Each time a user conducts a search on Google, a database logs his or her keyword search, IP address and certain other bits of data stored in cookies. Currently, this information is stored indefinitely, but the new policy, which Google plans to implement over the next few months, will make the data slightly more anonymous to protect the privacy of its users.

“Previously, we kept this data for as long as it was useful,” Google officials said in a statement. “Unless we're legally required to retain log data for longer, we will anonymize our server logs after a limited period of time.”

Google says it will remove the last eight bits of a user’s IP address 18 to 24 months after the information is first recorded. All the bits before them, however, will remain intact and may still give authorities a good indication of the original user. Even with the last eight bits of an IP address unknown, it is still possible to determine the approximate location and internet service provider of the user.
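The scheme described amounts to masking off the final octet of an IPv4 address. A rough sketch of that operation (the sample address is from a reserved documentation range, not real user data):

```python
import ipaddress

def anonymize_ipv4(addr: str) -> str:
    """Zero the last eight bits (the final octet) of an IPv4 address."""
    ip = ipaddress.IPv4Address(addr)
    return str(ipaddress.IPv4Address(int(ip) & ~0xFF))

print(anonymize_ipv4("203.0.113.77"))  # 203.0.113.0
```

The surviving 24 bits still pin the user to a block of at most 256 addresses, which is why the approximate location and ISP remain recoverable, as the article notes.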

“Logs anonymization does not guarantee that the government will not be able to identify a specific computer or user, but it does add another layer of privacy protection to our users' data,” Google said to the media.

The U.S. government has been putting pressure on search companies to keep records of user activities in an effort to maintain national security. Privacy advocates, on the other hand, are pushing in the opposite direction and lobby for companies such as Google to maintain no records at all.

“By anonymizing our server logs after 18-24 months, we think we're striking the right balance between two goals: continuing to improve Google's services for you, while providing more transparency and certainty about our retention practices,” the Google statement said.

(c) www.dailytech.com


Friday, March 16, 2007

Intel suffered but AMD surged in 2006, iSuppli says

Press release, March 16; Rodney Chan, DigiTimes.com [Friday 16 March 2007]

It was a tale of two companies in the semiconductor industry in 2006, with leading chip supplier Intel suffering a revenue decline, while rival AMD nearly doubled its sales, according to iSuppli.

"For US microprocessor giant Intel, 2006 was the worst of times, as its global semiconductor revenue dropped by 11.1% from 2005," said Dale Ford, vice president of market intelligence for iSuppli. "The revenue decline, which was due to Intel's bleak performance in its core PC microprocessor and flash-memory businesses, erased nearly all of the company's sales gains from its strong year in 2005. Intel's 2006 revenue of US$31.5 billion was less than half a percentage point higher than its sales in 2004."

"For Intel's smaller US rival, AMD, 2006 was the best of times as it achieved a whopping 91.6% increase in revenue for the year, partly due to a major acquisition, but also because of strong gains in microprocessor market share," Ford added.

This robust increase in revenue caused AMD's ranking to rise to eighth place in 2006, up seven positions from the 15th rank in 2005.

The divergent performances of Intel and AMD came during a 2006 when global semiconductor industry revenue rose by 9.3% to reach US$260.2 billion, up from US$237.98 billion in 2005. This is slightly higher than the 9% growth iSuppli predicted in its preliminary market share estimate compiled in November and released in early December.
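The headline growth figure can be checked directly from the two revenue totals quoted above:

```python
rev_2005 = 237.98  # US$ billion, from the iSuppli figures above
rev_2006 = 260.2

# Year-over-year growth of global semiconductor revenue
growth = (rev_2006 - rev_2005) / rev_2005
print(f"{growth:.1%}")  # 9.3%
```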

Intel in 2006 faced hard times in its microprocessor and flash-memory businesses, which together accounted for 83% of total company revenue last year. The company's combined microprocessor and flash revenue in 2006 fell to its lowest level since 2003 as Intel faced rising competitive pressure in those markets. The revenue decline resulted in Intel's market share falling to 12.1%, its lowest level since before 2000.

Meanwhile, AMD in 2006 gained PC microprocessor market share at Intel's expense. AMD's PC microprocessor revenue rose by 35.5% in 2006 and its market share in that product segment increased to 16.1%, up 5 percentage points from 11.1% in 2005.

AMD's revenue also was boosted substantially by its acquisition of graphics chip seller ATI Technologies in 2006.

Looking beyond Intel and AMD, 2006 was a banner year for the leading pure-play memory chip suppliers.

Memory supplier Hynix Semiconductor of South Korea leapt to the seventh-place position in 2006, up from 11th in 2005 as its revenue surged by an impressive 41.5%. Hynix's memory revenue growth of US$2.3 billion surpassed the US$1.8 billion memory sales increase posted by memory-chip leader Samsung Electronics in 2006.

Germany's Qimonda, a newly created pure-play memory company formed from the spin-off of Infineon's memory business, increased its revenue by 54.9% in 2006.

However, the fastest-growing memory supplier in 2006, and the quickest-expanding supplier among the world's top-25 chip sellers, was Japan's Elpida Memory. Elpida's revenue nearly doubled in 2006, rising by 98.6% from 2005. This caused the company's ranking to rise to 19th in 2006, up from 28th in 2005.

Memory ICs were the key segment driving the growth of the overall semiconductor industry in 2006, with revenue in this area rising by 22.7%. A stronger-than-anticipated revenue increase in the fourth quarter boosted annual growth for DRAM to 35.2% in 2006.

(c) www.digitimes.com


Thursday, March 15, 2007

NVIDIA GeForce 8600-Series Details Unveiled

by Anh Huynh

NVIDIA prepares its next-generation mid-range and mainstream DirectX 10 GPUs

Earlier today DailyTech received its briefing on NVIDIA’s upcoming GeForce 8600GTS, 8600GT and 8500GT graphics processors. NVIDIA’s GeForce 8600GTS and 8600GT are G84-based GPUs and target the mid-range market. The lower-positioned, G86-based GeForce 8500GT serves as the flagship of the low-to-mid-range segment.
The budget-priced trio features full support for DirectX 10, including pixel and vertex shader model 4.0, though NVIDIA has yet to reveal the number of shaders or the shader clocks. Nevertheless, all three support NVIDIA SLI and PureVideo technologies.


NVIDIA GeForce 8600GTS

 


NVIDIA GeForce 8600GT

NVIDIA touts three dedicated video engines on the G84- and G86-based graphics cards for PureVideo processing. The video engines provide MPEG-2 high-definition and WMV HD video playback at resolutions up to 1080p. G84 and G86 support hardware-accelerated decoding of H.264 video as well; however, NVIDIA makes no mention of VC-1 decoding. G84 and G86 also feature advanced video post-processing algorithms, including spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal, 5-tap vertical video scaling.
At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU. Add-in board partners will have to purchase separate EEPROMs with HDCP keys; however, all GeForce 8600GTS-based graphics cards feature support for HDCP.
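The article doesn't state the 8600GTS's memory bandwidth, but it follows from the figures given, assuming the quoted 1000 MHz is the base clock and that GDDR3 transfers data twice per clock:

```python
bus_width_bits = 128      # memory interface width from the article
base_clock_hz = 1000e6    # quoted 1000 MHz, assumed to be the base clock
transfers_per_clock = 2   # GDDR3 is double data rate

# Bytes per second across the memory interface, in decimal gigabytes
bandwidth_gb_s = (bus_width_bits / 8) * base_clock_hz * transfers_per_clock / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s")  # 32 GB/s
```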
GeForce 8600GTS-based graphics cards require an eight-layer PCB. Physically, the cards measure 7.2 x 4.376 inches and are available in full-height form only. NVIDIA GeForce 8600GTS graphics cards feature a PCIe x16 interface, unlike ATI’s upcoming RV630. GeForce 8600GTS-based cards still require external PCIe power. NVIDIA estimates total board power consumption at around 71 watts.
Supported video output connectors include dual dual-link DVI, VGA, SDTV and HDTV outputs, and analog video inputs. G84-based GPUs do not support a native HDMI output. Manufacturers can adapt one of the DVI-outputs for HDMI.
NVIDIA’s GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration is more flexible, letting manufacturers decide between 256MB and 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional: the GPU and reference board design support the required HDCP-keys EEPROM, but the implementation is up to NVIDIA’s add-in board partners.
GeForce 8600GT-based graphics cards require only a six-layer PCB instead of the eight-layer PCB of the 8600GTS. The board is also smaller, measuring 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power. NVIDIA rates the maximum board power consumption at 43 watts -- 28 watts less than the 8600GTS.
The GeForce 8600GT supports the same video outputs as the 8600GTS; however, the 8600GT does not support video input features.
NVIDIA has revealed very little information on the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well.
Expect NVIDIA to pull the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter in time to take on AMD’s upcoming RV630 and RV610.

(c)  www.dailytech.com

Intel to Sample Phase Change Memory This Year

by Marcus Yam

Intel claims it will mass produce phase change memory before the end of 2007

This week Intel privately shared parts of its roadmap for memory technologies through 2008. Intel’s phase-change memory (PCM, also known as PRAM) will soon be sampled to customers, with mass production possible before the end of the year.

 


Phase change memory wafer manufactured at 90nm

Phase-change memory is positioned as a replacement for flash memory, as it has non-volatile characteristics, but is faster and can be scaled to smaller dimensions. Flash memory cells can degrade and become unreliable after as few as 10,000 writes, but PCM is much more resilient at more than 100 million write cycles. For these reasons, Intel believes that phase-change memory could one day replace DRAM.

“The phase-change memory gets pretty close to Nirvana,” said Ed Doller, CTO of Intel’s flash memory group. “It will start to displace some of the RAM in the system.”

For its implementation of phase-change memory, Intel has licensed technology from Ovonyx Inc. since 2000. The Ovonyx technology uses the properties of chalcogenide glass, the same material found in CD-RW and DVD-RW discs, which can be switched between crystalline and amorphous states for binary functions.

Every potential PCRAM memory maker thus far licenses Ovonyx technology. According to Ovonyx’s Web site, the first licensee of the technology was Lockheed Martin in 1999, with Intel and STMicroelectronics in the following year. Four years after that, Nanochip signed an agreement. Elpida and Samsung were the next two in 2005, and Qimonda marks the latest with a signing this year.

IBM, Macronix and Qimonda detailed their recent developments in phase-change memory last December. Researchers at IBM’s labs demonstrated a prototype phase-change memory device that switched more than 500 times faster than flash while using less than half the power to write data into a cell. The IBM device’s cross-section is a minuscule 3 by 20 nanometers in size, far smaller than flash can be built today and equivalent to the chip-making capabilities the industry is targeting for 2015.

Intel’s initial phase-change technology, however, is already a reality, as the chipmaker revealed that it has produced a 90 nanometer phase-change memory wafer. At the 90 nanometer process size, the power requirements to write are approximate to that required for flash. Intel said that its early test work shows data retention abilities of greater than 10 years even at temperatures of 85 degree Celsius.

Intel touts PCM as a “new category of memory,” as its attributes are distinctly different from, and typically superior to, many of today's memory technologies: it combines the best attributes of RAM, NOR and NAND. Intel wouldn’t give a firm date for the availability of its phase-change memory, as several details still need to be finalized after the sampling process.

“We're going to be using this to allow customers to get familiar with the technology and help us architect the next generation device.” Doller said. “We're hoping we can see [mass] production by the end of the year, but that depends on the customers.”

(c) www.dailytech.com

 


Thursday, February 22, 2007

The Future of HDMI

by Tuan Nguyen

Is 2007 the year of the display format wars? A look at the licensing structures of these formats reveals more

Unfortunately, consumers will be faced with a total of three display standards in 2007 -- and even more in 2008. Along with HDMI, computers will start to ship with DisplayPort and the Universal Display Interface (UDI) this year.  UDI is electrically compatible with DVI and HDMI, but does not carry the same licensing fees as either and has a stripped-down feature set.  DisplayPort is not compatible with any existing signaling format.
One of the primary concerns for these new standards is cost and interoperability.  Expensive HDMI and HDCP certification is cited as one of the culprits delaying AMD 690G motherboards.
High fidelity signaling backers are split into two licensing camps: one supporting the DVI-derivatives (DVI-HDCP, HDMI, UDI) and the other supporting DisplayPort. AMD, Dell, Genesis Microchip, Hewlett-Packard, Molex, NVIDIA, Philips, Samsung and Tyco Electronics are supporters of DisplayPort; Hitachi, Panasonic, Philips, Sony, Silicon Image, Thomson and Toshiba compose the primary backers of HDMI. A significant portion of the DisplayPort supporters also have interests in HDMI.  Earlier last year, several manufacturers including Sapphire and PowerColor announced HDMI-enabled graphics cards based on ATI GPUs. MSI also announced HDMI cards based on NVIDIA GPUs.
When DailyTech asked why HDMI was taking a long time to appear in PC products, Leslie Chard, president of HDMI Licensing LLC, said "Right now most manufacturers are considering the cost of adding HDMI to their graphics products. Since HDMI is based mainly on DVI signals, the technology is already available in graphics processors. HDMI is everywhere -- consumer electronics, home entertainment and now companies are demanding the technology for smaller handhelds. You can't beat HDMI's cross platform compatibility."

Joe Lee, director of marketing for Silicon Image, added "Card manufacturers now only have to consider ways of grabbing the sound output through the PCI Express bus and adding the cost of the physical connector. If card manufacturers can finish writing the special [drivers] needed to grab the audio, everything would be set. Windows Vista should help drive HDMI forward."
According to initial reports, DisplayPort was heralded as a royalty-free technology. As it stands today, DisplayPort is royalty-free but builds on well over 200 patents. According to VESA, the committee that oversees the DisplayPort standard, the intellectual property (IP) holders are not bound to that arrangement and may charge a "reasonable" fee for the technologies used in DisplayPort.
Chard took a shot at DisplayPort, claiming "These IP holders are free to charge royalties under RAND [Reasonable and Non-Discriminatory] terms.  Until these IP holders make a public commitment, manufacturers have no idea what this rate will be.  Moreover, additional IP holders may come forward and charge additional royalties in the future; this is especially true if the DisplayPort standard ever evolves to incorporate advanced new technologies." 
HDMI's fees are already disclosed -- $0.04 per product and a small fee for the HDCP keys, if used. HDMI Licensing LLC reduced the fees associated with using the technology late last year.
The largest hurdle DisplayPort faces, besides getting out the door, is interoperability with other devices.  DisplayPort is not compatible with HDMI, UDI or DVI.  The difficulty in jumping from one signaling protocol to another is that the DVI-derivative protocols use HDCP, while DisplayPort uses DPCP as well as HDCP.   VESA partners claim they will develop devices that allow HDMI-to-DisplayPort conversion, though doing so would circumvent DPCP.  Lee points out that this is essentially against the whole principle of a content protection protocol in the first place: if someone can freely negotiate between multiple or non-existent protocols that aren't under the same certification umbrella, then why have a certification process at all?
It has not yet been disclosed whether DisplayPort implementers will be required to pay royalties for HDCP and DisplayPort Content Protection (DPCP) conversion either.
As of right now, the consumer electronics playing field is blanketed with HDMI-enabled products. The technology also recently entered its 1.3 revision, supporting features such as higher resolution and deep-color (wider color gamut) -- Sony's PlayStation 3 supports HDMI 1.3.  Philips, the inventor of DisplayPort's content protection scheme DPCP, recently announced a wireless version of HDMI.
AMD is expected to launch DisplayPort compatible GPUs later this year with NVIDIA opting for the standard as well. Early last year, Silicon Image stated that UDI will end up replacing both HDMI and DVI standards on the PC when it becomes available to reduce licensing fees, though it will still be compatible with the older standards.  

(c) www.dailytech.com

Wednesday, January 10, 2007

CES 2007: Pioneer demos the future of plasma

Pioneer made some bold claims on Sunday about its brand spanking new plasma technology, which should shoo off those pesky LCD and SED screens that have been sniffing around its turf. We were taken into a back room of the Pioneer booth to witness the spectacle. There, Pioneer had stood two 60" TVs side by side, one using the current system and the other the new one.

The first thing that struck me was that while the TVs sat idle, with nothing showing on the screen, you could still see a faint glow of luminance on the older-tech set. The new plasma screen, however, offers blacks so dark that you can't tell whether it is showing a blank screen or has been switched off.

We kicked things off with a demo movie designed to show off all of the new features, such as the contrast ratio (which apparently is so good it can't be measured with standard equipment) and the richer colours. Then it was time for the side-by-side comparison, and it became clear just how much difference the new system makes. The colours are so much more vivid, skin tones are a lot more distinct and, of course, blacks look considerably darker.

To demonstrate how much better the new contrast works, the lights were turned up and down, and it was only when the lights came up quite a long way that you could even begin to see any effect on the new screen.

One thing I did notice was that the new plasma was perhaps a little too black, and the colours a little too deep, which meant that the demo started to take on an unreal quality. Of course, you're unlikely ever to watch anything that even remotely resembles what they showed in the demo, so it might be best to take the results of this comparison with a pinch of salt. However, as it currently stands, it looks like Pioneer has delivered on its promises.

(c) www.techdigest.tv

Westinghouse Quad HD on display


In all the hype that built up before CES this year, we were really excited to check out the Quad HD display from Westinghouse. We went by the Westinghouse booth to see it; oddly, the display was at the back of the booth and not well marked. After receiving some help we found it showing some scientific data. Other than industrial uses, we're not sure what we would do with this 52" display and its 3840 x 2160 resolution. It would take one good scaler to scale an image that big.

(c) www.engadget.com

Nakamichi Kimono LCD display

Nakamichi Kimono LCD display

CES 2007 - Nakamichi is well known for its line of home entertainment systems (especially those on the audio front), and this 42" LCD display comes with a hand-lacquered bezel which shimmers like the finish of a grand piano (read: fingerprint magnet). According to Nakamichi, the Kimono LCD display is pretty hardy, as the 7-coat lacquer bezel is more than capable of standing the test of time while retaining its beauty. You get the full 1080p experience as well as SRS TruSurround multi-channel audio, which ought to keep you entertained for hours on end. The design looks rather weird to me, but it was still enough to pick up the Design and Engineering Showcase Award for a Video Display. I suppose I don't have a knack for appreciating abstract art. What do you think?

(c) www.ubergizmo.com

Sunday, January 7, 2007

Meccano to unveil WiFi-enabled Spyke Robot Set

Meccano is all set to unveil a WiFi-controlled, Erector-branded robot kit at CES. The Spyke certainly improves on Meccano's earlier metallic attempts at robot sets: this little fella is capable of feeding a webcam video stream to a PC over the aforementioned wireless connection, as well as climbing stairs with its triangular tank-track; that's right parents, your staircase can no longer keep your kids safe. These basic specs combined with the teaser image on the right should be enough to perk up the ears of all the consumer robot enthusiasts out there -- who are now no doubt waiting to hear how competitively priced the Spyke will be in comparison to Lego's Mindstorms series of robot kits -- but unfortunately there's no other information (availability, pricing, etc.) about the Spyke as of yet. Don't fret though, because when CES kicks off in a couple of days, all will be revealed.

(c) www.engadget.com

Norcent readies new plasmas / LCD HDTV for CES

by Darren Murph

While it seems we really only hear from Norcent when CES is approaching, the California-based outfit is kicking it up a notch this year, as it plans to showcase a duo of new plasmas and a flagship LCD HDTV. Both PDPs will feature integrated ATSC / NTSC tuners, HDMI, component / S-Video / composite, VGA, 160-degree viewing angle, SRS-enhanced stereo speakers, and a "3D digital comb filter" for color processing. Additionally, the 50-inch PT-5045HD plasma will sport a 1,366 x 768 resolution, 1,000 cd/m2, and a 10,000:1 contrast ratio, while the 42-inch PT-4246HD gets stuck with a 1,024 x 768 resolution and 8,000:1 contrast ratio, but picks up a Clear-QAM tuner and a 1,200 cd/m2 brightness rating. Over on the LCD side, the 37-inch VION LT-3790 one-ups the LT-3725 we saw just recently by touting a larger panel, 1,366 x 768 resolution, 1,000:1 contrast ratio, 500 cd/m2 brightness, 176-degree viewing angle, built-in ATSC / NTSC (Clear-QAM) tuners, HDMI, component / S-Video / composite inputs, VGA, and the same SRS-enabled speakers as on the plasmas. For those interested in picking one up, the PT-4246HD is on the streets right now for a very reasonable $1,199.99, while the 50-inch flavor will land next month for $1,899.99, and the $1,199.99 VION LCD will hit stores sometime Q2 2007.

(c) www.engadget.com

Saturday, January 6, 2007

GE90-115B Gas Turbine Jet Engine Testing & Evaluation

Toshiba unveils world's first HD DVD writer


Always expect firsts from Toshiba America, the maker of computers/storage/projectors and so much more. The company announced the world's first HD DVD burner for desktop computers. (Wondering what HD DVDs are? Catch up on a story I wrote last year)

We knew this was coming. High-def DVD players have been available since April 2006 (mostly thanks to Toshiba's computer and consumer electronics divisions). Toshiba says more than 1.5 million HD DVD movies have been sold. But who can resist using the same discs for storage purposes? Since high-definition video needs oodles of gigabytes, the discs can hold 30GB of digital files (approximately up to five full-length standard DVD films, up to 7,500 MP3 songs or up to 30,000 high-quality images, according to Toshiba).
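Toshiba's capacity comparisons imply simple per-item sizes, recoverable with back-of-envelope arithmetic (assuming decimal units, 1 GB = 1000 MB):

```python
capacity_mb = 30 * 1000  # 30GB disc, in decimal megabytes

# Per-item sizes implied by Toshiba's "up to" figures
film_gb = 30 / 5                # five films per disc -> GB per film
song_mb = capacity_mb / 7500    # MB per MP3 song
image_mb = capacity_mb / 30000  # MB per image

print(film_gb, song_mb, image_mb)  # 6.0 4.0 1.0
```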

The SD-H903A internal drive will be sold to computer companies and manufacturers beginning in February. The good news for consumers: we may start seeing PCs with HD DVD burners in the spring or summer.

Other specs:
It only writes HD DVD content in real time (that's 1x speed)
Also compatible with all older DVD and CD formats.

(c) blogs.ocregister.com

Three Intel quad-cores coming Monday

Intel plans to launch three quad-core processors on Monday, covering two Xeons for lower-end servers and one mainstream model for desktop computers, sources familiar with the plan said.

As expected, the desktop chip is called the Core 2 Quad 6600 and will join the Core 2 Extreme QX6700 model Intel already ships. The new processor will run at 2.4GHz, and the front-side bus that links the chip to the rest of the system will run at 1066MHz, the company is expected to announce at the Consumer Electronics Show in Las Vegas next week.

Also set to arrive are two low-end Xeons, the 2.13GHz 3210 and 2.4GHz 3220. Both are designed for single-processor servers. The chips have 8MB cache and a 1066MHz front-side bus.

Intel declined to comment for this story.

The chipmaker began its quad-core product launch in November but now is fleshing out the lineup. It often launches desktop products with extreme models geared for demanding video game systems, then adds more moderately priced mainstream models later.

"I expect, with respect to the desktop quad-core, it's mostly a matter of maintaining a certain cadence, even if, practically speaking, there won't be a whole lot of near-term buyers," Illuminata analyst Gordon Haff said.

Intel's quad-core processors combine two dual-core chips into a single package. Rival chipmaker Advanced Micro Devices has a quad-core processor code-named Barcelona under development that puts all four cores on a single slice of silicon. However, that chip won't arrive until midway through this year.

Servers, which often juggle multiple independent tasks, are well-suited to taking advantage of multiple processing cores. With desktop machines, however, the benefits aren't as clear because software often isn't able to use all the cores effectively.

(c) news.com.com

Toshiba Designing New Reactor

Japan's Toshiba Corp. (6502.TO) said Friday it was independently designing a new boiling-water reactor, or BWR, for use in nuclear power plants.
"We are making a preliminary conceptual design for a next-generation" BWR, a company spokesman said.
Toshiba, a Tokyo-based company whose businesses run from consumer electronics to large industrial infrastructure, has developed its nuclear reactor business together with General Electric Co. (GE) for over 40 years.
But in November, GE agreed to integrate its nuclear operations with Hitachi Ltd. (6501.TO), fueling speculation that GE and Toshiba could eventually become rivals. A report Friday in The Nikkei said Toshiba would market a BWR on its own by 2015.
The Toshiba spokesman said the new reactor was being chiefly designed for the domestic market, as the Japanese government was planning to upgrade its current generation of reactors from around 2030.
"It is too premature to discuss competition" with GE, he said, adding that the two companies are still jointly developing boiling water reactors.

(c) neinuclearnotes.blogspot.com

Here comes the terabyte hard drive

Last year, Hitachi Global Storage Technologies predicted hard-drive companies would announce 1 terabyte drives by the end of 2006. Hitachi was only off by a few days.

The company said on Thursday that it will come out with a 3.5-inch-diameter 1 terabyte drive for desktops in the first quarter, then follow up in the second quarter with 3.5-inch terabyte drives for digital video recorders (bundled with software called Audio-Visual Storage Manager for easier retrieval of data) and for corporate storage systems.

The Deskstar 7K1000 will cost $399 when it comes out. That comes to about 40 cents a gigabyte. Hitachi will also come out with a similar 750GB drive. Rival Seagate Technology will come out with a 1 terabyte drive in the first half of 2007.

Hitachi Deskstar 7K1000

The two companies, along with others, will tout their new drives at the upcoming Consumer Electronics Show in Las Vegas, and will show off hybrid hard drives, as well.

A terabyte is a trillion bytes, or a million megabytes, or 1,000 gigabytes, as measured by the hard-drive industry. (There are actually two conventions for calculating megabytes, but this is how the drive industry counts it.) As a reference, the print collection in the Library of Congress comes to about 10 terabytes of information, according to the How Much Information study from U.C. Berkeley. The report also found that 400,000 terabytes of e-mail get produced per year. About 50,000 trees would be necessary to create enough paper to hold a terabyte of information, according to the report.

Who needs this sort of storage capacity? You will, eventually, said Doug Pickford, director of market and product strategy at Hitachi. Demand for data storage capacity at corporations continues to grow, and it shows no sign of abating. A single terabyte drive takes up less space than four 250GB drives, which lets IT managers conserve on computing room real estate. The drive can hold about 330,000 3MB photos or 250,000 MP3s, according to Hitachi's math.
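Hitachi's math above is easy to check. Below is a minimal back-of-the-envelope sketch using decimal units, as the drive industry counts them (1 TB = 1,000 GB = 1,000,000 MB); the 4 MB-per-song size is an assumption inferred from Hitachi's 250,000-MP3 figure, not a number from the article.

```python
# Back-of-the-envelope check of the capacity and cost figures above,
# using decimal (drive-industry) units: 1 TB = 1,000 GB = 1,000,000 MB.

TB_IN_MB = 1_000_000

cost_per_gb = 399 / 1_000        # $399 for 1,000 GB
photos = TB_IN_MB // 3           # 3 MB photos
mp3s = TB_IN_MB // 4             # assumed ~4 MB per MP3

print(f"${cost_per_gb:.2f}/GB")              # about 40 cents a gigabyte
print(f"{photos:,} photos, {mp3s:,} MP3s")   # ~330,000 photos, 250,000 MP3s
```

The photo count lands at 333,333, which matches the article's "about 330,000."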

Consumers, meanwhile, are gobbling up more drive capacity because of content like video. An hour of standard video takes up about 1GB, while an hour of high-definition video sucks up 4GB, Pickford said.

Consumers, though, tend to be skeptical of ever needing more storage capacity.

"We heard that when we brought out 1 gigabyte drives," Pickford said.

The boost in capacity for desktop drives comes in part through the introduction of perpendicular recording technology to 3.5-inch-diameter drives. In perpendicular drives, data can be stored in vertical columns, rather than on a single plane. Drive makers have already released notebook drives, which sport smaller 2.5-inch-diameter drives, with perpendicular recording. The 1 terabyte drives will be Hitachi's first 3.5-inch drives with perpendicular recording.

Currently, Hitachi sells 3.5-inch drives that hold 500GB of data, while Seagate has come out with a 750GB data drive.

Drive makers convert to perpendicular recording when the need for areal density, the measure of how much data can be crammed into a square inch, passes 125 gigabits. The terabyte drive (and the 750GB drive) can hold 148 gigabits per square inch, or 148 billion bits. Hitachi's previous 3.5-inch drives maxed out at 115 gigabits per square inch.

The hard drive turned 50 last year, and over the past five decades data capacity has increased at a fairly regular and rapid pace. The first drive, which came with the RAMAC computer, weighed about a ton and held 5MB of data.

Hard-drive scientists say that increases in capacity will continue because of technologies like heat-assisted recording and patterned media.

(c) www.netscape.com

Friday, January 5, 2007

Samsung announces double-sided LCDs

by Ken Fisher

If two heads are better than one, are two faces on a single LCD a no-brainer? Samsung today announced that it has created thin-film transistor, liquid crystal display (TFT-LCD) panels capable of showing independent images on both sides of an LCD screen. Unlike other two-faced LCDs which only show an image and its reverse on the flipside, the Samsung solution can display two different images. According to Samsung, the dual-sided LCDs have been designed primarily for use in mobile products. One obvious application would be cell phones that currently have two separate displays.

Is this a battery-life nightmare in the making? Samsung says no. Although there are two visual surfaces, the new dual-sided LCD uses one backlight. One side of the LCD also helps out with illumination, using the light trapped by one side's "transmissive mode" projection to illuminate the reflective side. Transmissive LCDs are those which are actively illuminated from behind the LCD cells, and make up most LCDs that we look at these days. Reflective LCDs are just the opposite: they rely on ambient light for illumination, and therefore are less clear and offer less contrast than transmissive LCDs. You've seen these on digital watches or on the exteriors of many flip phones (some have backlights that can be activated).

How do you display two images on a single LCD? Normal LCDs have TFT gates that each operate a single pixel, but Samsung's dual-sided LCDs put two control gates on each pixel, allowing it to control each side independently without having to double the circuitry. Complex calculations take care of the rest.

Samsung's dual-sided LCD panels

Samsung will begin mass production of a 2.6mm thick and 2.22" wide QVGA (240 x 320 pixel) dual-sided panel in the first half of this year. The company said it will show off the display at CES, so we'll be sure to check it out and ask them if they expect this technology to ever make its way outside of mobile devices.

(c) www.arstechnica.com

Thursday, January 4, 2007

Samsung to unveil large LCD TVs

by Eric A. Taub

Samsung, the world's largest seller of televisions, will introduce at next week's Consumer Electronics Show a new line of rear-projection liquid crystal display televisions that will not be much thicker than flat-panel TVs but will cost about 30 percent less.

The new sets, aimed at those who want a bigger set but cannot afford a plasma TV, will come in 50- to 60-inch sizes. At 10 inches deep, they can be hung on a wall.

While consumers are embracing LCD flat-panel TVs in ever-larger sizes, Samsung has also recommitted itself to plasma TV and will double its current plasma production capacity later this year, said G.S. Choi, the company's president for digital media, in an interview from South Korea.

Plasma will continue to dominate in larger sizes, and the company will add an 80-inch model to its lineup this year, Choi said. To help consumers fit these larger sizes in their homes, Samsung said it was redesigning the TV frames to make them up to 30 percent thinner.

At the electronics show, held annually in Las Vegas, Samsung will also introduce a wireless plasma TV that will be able to receive HDTV programming sent from an HD DVD or Blu-ray player or set-top box. Because there will be no cables to hide, consumers may be more likely to hang such a plasma display on a wall.

The company, which was the first to introduce a Blu-ray high-definition DVD player last year, will announce its second-generation player next week. The new model is expected to cost about 20 percent less than the current version, but will have more interactive functions.

Although LG will introduce a dual format HD DVD/Blu-ray player, Samsung has no plans to do the same. "If the market is still divided, we could do a dual-format player, but we will wait and see," Choi said.

Entire contents, Copyright © 2007 The New York Times. All rights reserved.

LG to Launch Dual-format Blu-ray Disc and HD DVD Player

It was bound to happen: In the mess known as the high-def format wars, eventually, it was clear a manufacturer would cross party lines and release a single player capable of handling both Blu-ray Discs and HD DVD discs. The question wasn't if; it was a question of when.

LG Electronics is the first to cross that line: The company has just announced it will be launching the first dual-format high-definition disc player at the Consumer Electronics Show in Las Vegas next week. The LG press release issued in Korea early Thursday morning is short on details--the player will launch in "early 2007", but beyond that, we have no information on pricing--but that's of little consequence for now. I imagine the details will become clear by Sunday, when LG holds its press conference at CES.

LG stated it was considering a dual-format player at the CeBIT show last March, but the company has been quiet about its progress until now. The company is the first to formally announce a dual-format player; prior to this, Ricoh and NEC had both announced they had developed components that could read both Blu-ray and HD DVD media, but neither had announced actual products. Samsung had also made rumblings about coming out with a dual-format player, but the company backtracked on those reports early last year.

The LG announcement dramatically alters the competitive landscape for Blu-ray Disc and HD DVD. The mere announcement of a dual-format player could stall the market for high-definition players and discs, as consumers anticipate the dual-format player's arrival. A dual-format player would offer consumers a hedge against obsolescence, in the event one of the disc formats dies out over time.

And once the dual-format player does come out, it could ignite the market for high-definition players and discs, a market that's still in its infancy.

Price will likely play a big role, though, in the dual-format player's success. If the player is expensive--and, certainly, I expect it to carry a premium over a standalone player, at least at launch--its high price may deter consumers from buying right now. However, if the dual-format player's premium is an acceptable one to consumers, then the player could take off--in turn driving consumers to buy movies in high-definition, without having to worry about which studios are backing which disc format.

However, while a dual-format player will help consumers worried about buying into the wrong format, it won't help content producers. Dual-format players will remain a rarity, for at least the next year. If dual-format players do become the norm, studios will be faced with a quandary: Continue to support both formats, a costly endeavor, or release new and catalog content in just one of the disc formats--thereby foregoing support of those early adopters who bought into whichever disc format falls by the wayside.

Are you itching to buy a high-def disc player? Does news of a dual-format player make you more likely to buy a high-def disc player in the next year?

(c) blogs.pcworld.com

Wednesday, January 3, 2007

Samsung Samples 50nm 16Gb NAND Flash

Samsung inches closer to making SSDs more mainstream

Of all storage technologies in computers, the hard drive has advanced the slowest as far as performance is concerned. Companies like Samsung are looking to Flash Solid State Disks (SSDs) to replace the spinning disk and reduce loading times for applications.

SSDs have the advantage of rapid response times without having to wait for a hard drive to spin up/seek and have drastically reduced power consumption compared to traditional hard drives. SSDs use zero watts when not being accessed, and as little as 200 milliwatts during read/write activities.

Given the lower power requirements, companies like Sony and Fujitsu are looking to Samsung to provide SSDs for their mobile computers. Samsung also uses its SSD drives on the Q30 notebook and Q1 UMPC.

Samsung announced today that it has produced samples of the world's first 16Gb NAND flash memory device built on a 50 nanometer process. The multi-level cell (MLC) design uses a 4KB page size instead of the 2KB used in competing designs. As a result, read speeds are double that of 2KB designs while write speeds are increased by 150%.

The increased storage capacity and faster write speeds will help Samsung reach its goal of producing 128GB SSDs by the first half of 2008.

Samsung will begin mass production of its new MLC 16Gb NAND flash memory chips in Q1 2007.

(c) www.dailytech.com

Cheaper LEDs from breakthrough in ZnO nanowire research

P-type ZnO Nanowires

SEM image of p-type ZnO nanowires created by electrical engineering professor Deli Wang at UC San Diego. Note: the blue color was added in Photoshop. Credit: Deli Wang/UCSD

Engineers at UC San Diego have synthesized a long-sought semiconducting material that may pave the way for an inexpensive new kind of light emitting diode (LED) that could compete with today's widely used gallium nitride LEDs, according to a new paper in the journal Nano Letters.

To build an LED, you need both positively and negatively charged semiconducting materials; and the engineers synthesized zinc oxide (ZnO) nanoscale cylinders that transport positive charges or "holes" – so-called "p-type ZnO nanowires." They are endowed with a supply of positive charge carrying holes that, for years, have been the missing ingredients that prevented engineers from building LEDs from ZnO nanowires. In contrast, making "n-type" ZnO nanowires that carry negative charges (electrons) has not been a problem. In an LED, when an electron meets a hole, it falls into a lower energy level and releases energy in the form of a photon of light.
Deli Wang, an electrical and computer engineering professor from UCSD's Jacobs School of Engineering, and colleagues at UCSD and Peking University, report synthesis of high quality p-type zinc oxide nanowires in a paper published online by the journal Nano Letters.
"Zinc oxide nanostructures are incredibly well studied because they are so easy to make. Now that we have p-type zinc oxide nanowires, the opportunities for LEDs and beyond are endless," said Wang.
Wang has filed a provisional patent for p-type ZnO nanowires and his lab at UCSD is currently working on a variety of nanoscale applications.
"Zinc oxide is a very good light emitter. Electrically driven zinc oxide single nanowire lasers could serve as high efficiency nanoscale light sources for optical data storage, imaging, and biological and chemical sensing," said Wang.
To make the p-type ZnO nanowires, the engineers doped ZnO crystals with phosphorus using a simple chemical vapor deposition technique that is less expensive than the metal organic chemical vapor deposition (MOCVD) technique often used to synthesize the building blocks of gallium nitride LEDs. Adding phosphorus atoms to the ZnO crystal structure leads to p-type semiconducting materials through the formation of a defect complex that increases the number of holes relative to the number of free electrons.

"Zinc oxide is a wide band gap semiconductor, and generating p-type doping impurities that provide free holes is very difficult – particularly in nanowires. Bin Xiang in my group worked day and night for more than a year to accomplish this goal," said Wang.
The starting materials and manufacturing costs for ZnO LEDs are far less expensive than those for gallium nitride LEDs. In the future, Wang expects to cut costs even further by making p-type and n-type ZnO nanowires from solution.
For years, researchers have been making electron-abundant n-type ZnO nanowire crystals from zinc and oxygen. Missing oxygen atoms within the regular ZnO crystal structure create relative overabundances of zinc atoms and give the semiconductors their n-type, conductive properties. The lack of accompanying p-type ZnO nanowires, however, has prevented development of a wide range of ZnO nanodevices.
While high quality p-type ZnO nanowires have not previously been reported, groups have demonstrated p-type conduction in ZnO thin films and made ZnO thin film LEDs. Using ZnO nanowires rather than thin films to make LEDs would be less expensive and could lead to more efficient LEDs, Wang explained.
Having both n- and p-type ZnO nanowires – complementary nanowires – could also be useful in a variety of applications including transistors, spintronics, UV detectors, nanogenerators, and microscopy. In spintronics applications, researchers could use p-type ZnO nanowires to make dilute magnetic semiconductors by doping ZnO with magnetic atoms, such as manganese and cobalt, Wang explained.
Transistors that rely on the semiconducting properties of ZnO are also now on the horizon. "P-type doping in nanowires would make complementary ZnO nanowire transistors possible," said Wang.

(c) www.physorg.com

Google Misses YouTube Anti-Piracy Deadline

Google rings in the new year without its promised anti-piracy protection scheme

Although YouTube rang in 2007 with a virtual New Year's Eve festival complete with a performance from Warner Music, live performances and participation from hordes of online members, the company failed to meet its self-imposed deadline to implement anti-piracy protection on its site. In an agreement with Warner Music Group, YouTube promised in September to have an anti-piracy system in place that would feature an "advanced content identification and royalty reporting system."

The anti-piracy system to be in place by the end of 2006 was a part of a deal which allowed Google to distribute Warner music videos, artist interviews and other music-related content. When Google acquired YouTube for $1.65 billion USD in October, it was widely expected that Google's deeper pockets would give YouTube the financial backing to implement such anti-piracy measures.

Missing the year-end deadline could be seen as a virtual pothole on the road to a more controlled distribution channel, but YouTube still can save face by getting the system live within the opening weeks of 2007. "It is hugely important, especially from the rights holders' perspective, that the best efforts are being made to corral the stuff flowing through YouTube," said Michael McGuire of Gartner Research. "Rights holders are making specific bets on paths of distribution and are expecting serious effort to make uncontrolled distribution difficult for most folks to do."

For now, YouTube is leaving the ball in the user's court when it comes to copyrighted music by telling users that uploading content "shall be at your sole risk."

(c) www.dailytech.com

Rich Skrenta: Google won already

by Anders Bylund

Computer industry heavyweight Rich Skrenta thinks that Google has essentially won the online search and advertising war, and that the entire Internet is Google's to enjoy, direct, and profit from. Does he mean "won" in the sense of Athens at the battle of Marathon, or how the Iraqi information minister used it? Let's have a closer look.

Skrenta, who played instrumental parts in Netscape's search strategy, building the Open Directory, and developing the Amiga Unix OS, bases his claim on Google's technological lead and, in turn, the massive mindshare the company has built up over the years. He cites a study reporting that users looking at Google search results under a Yahoo logo are inclined to think the results less relevant, and vice versa. Following this, he argues, it doesn't really matter if Ask or Microsoft can beat Google at the search game, because it has already won that battle in the hearts and minds of consumers.

And what follows from Google's position at the center of the 'Net is a massive monetizing opportunity—which Google nailed down as well. The AdSense program is undeniably successful, though Skrenta may overstate its position. He claims that "Google's CPMs are $90-120, vs. $4-5 for an average browse page view elsewhere." We're somewhat suspicious of those figures here at the Orbiting HQ, and they may be his own best guess based on ad revenues at his own sites.

Skrenta keeps repeating his pithy mantra, "Google is not your competition, Google is the environment." He then says that Yahoo, for example, should follow the lead of Ask.com and surrender to the Google money machine. By his calculations, the Y is leaving $1.7 billion of annual revenue on the table by trying to run its own advertising network rather than letting AdSense/AdWords run the show.
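The CPM arithmetic behind claims like these can be sketched in a few lines. CPM is revenue per thousand ad impressions; the CPM figures below are the article's (disputed) numbers, and the 10 million monthly page views are a purely hypothetical site, not anything Skrenta cites.

```python
# Hedged sketch of CPM-to-revenue arithmetic. Only the CPM values come
# from the article; the page-view count is hypothetical.

def monthly_revenue(page_views: int, cpm_usd: float) -> float:
    """Revenue for a month of page views at a given CPM
    (CPM = dollars per thousand impressions)."""
    return page_views / 1000 * cpm_usd

views = 10_000_000  # hypothetical site
print(monthly_revenue(views, 100.0))  # search pages at a ~$100 CPM
print(monthly_revenue(views, 4.5))    # "average browse page" at ~$4.50 CPM
```

At those rates, the same traffic is worth roughly twenty times more as search inventory than as browse inventory, which is the gap Skrenta's $1.7 billion estimate for Yahoo rests on.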

By making advertising partners out of even potential competitors, and backing the whole thing up with arguably the best search engine available (though you may have other favorites), the Mountain View online Brobdingnagian has cemented its place as the central point of the Internet, the one place where all the buyers and the vendors, the readers and the publishers, meet up to conduct their business. To beat Google now, somebody would have to come up with a better search and a better advertising solution, and then convince the general public and the business world, respectively, of these feats.

It sounds daunting indeed, and Mr. Skrenta makes sense on many levels. I use and like a lot of Google's products every day. The company has plans to grow beyond the invisible borders of the online world pretty soon, starting with some tentative radio and print advertising programs. Call it the world's librarian, making a few pennies off every piece of information we all consume. Still, no victory is everlasting. We'll just have to see how long Google can sit on this throne. Ten years on the 'Net is an epoch. Will Google still be the leader 10 years from now?

(c) www.arstechnica.com

China residents logging on in greater numbers

The number of Internet users in China increased by 30 percent last year, according to one of the country's government-sponsored news agencies. The total number of Internet users in China is now approximately 132 million, of whom 52 million (about 39 percent) are using broadband. By comparison, there are approximately 207 million Internet users in the United States.

China now boasts more Internet users than any other country in Asia, followed by Japan with 86 million. Larger than the Internet populations of Africa, Australia, and the Middle East combined, China's 132 million makes up over 30 percent of the total Internet population of Asia, which is approximately 387 million. The total global number of Internet users is thought to be just over a billion—less than a sixth of the planet's inhabitants.
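The percentages above check out; a minimal sketch, with all user counts in millions (the ~6.5 billion world population is an assumed 2007 figure, not from the article):

```python
# Sanity check of the article's percentages. All user counts in millions.

china_users, china_broadband = 132, 52
asia_users = 387
world_users, world_population = 1_000, 6_500  # population is an assumption

print(f"broadband share in China: {china_broadband / china_users:.0%}")  # ~39%
print(f"China's share of Asia:    {china_users / asia_users:.0%}")       # over 30%
print(f"world online:             {world_users / world_population:.0%}") # < 1/6
```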

Although Internet access isn't quite ubiquitous yet in China, other communications technologies are extremely pervasive. According to statistics provided by the Chinese embassy, the number of cell phone users in China is greater than 430 million, which means that the country could potentially account for almost half of all cell phone purchases around the world.

As China's Internet population continues to increase, it could become more difficult for the Chinese government to enforce and maintain its censorship mechanisms. Often criticized by the international community for jailing Internet journalists who criticize state policy, the Chinese government uses complex firewalls to limit the accessibility of content that is sexual or critical of the country's government. According to United Press International, Chinese Internet users are increasingly using fake identity card numbers in order to access web sites and Internet games anonymously.

(c) www.arstechnica.com

Microsoft Gives Free Laptops to Bloggers

In a move to get favorable spins from key bloggers, Microsoft began sending them high-end notebooks loaded with Windows Vista Ultimate. Think it rubbed some people the wrong way?

One of the little publicized facts of the technology industry is that manufacturers and distributors routinely send out review hardware to press outlets, influential media figures, and others as a way to get publicity and raise awareness of their products. Many of these exchanges are on the up-and-up: journalists have to sign review agreements to return the units in a certain amount of time, and sometimes even provide credit information in the event the item is damaged or stolen. However, some manufacturers—particularly of low-cost or commodity items—don't care if the review units are returned, and sometimes items just arrive unsolicited: marketing and PR people are taking the chance that just putting a product in proximity to a publisher or reviewer will make good things happen.

As common as these practices are, they create a bit of a conundrum for journalists, since keeping or selling the items is effectively accepting a bribe and raises interesting ethical and tax issues.

This week, Microsoft has been rather publicly "outed" for sending fully loaded Acer Ferrari notebook computers with Windows Vista Ultimate and Office 2007 pre-installed to selected high-profile bloggers. Microsoft contacted the bloggers directly and offered the "review units" with no strings attached, saying bloggers could write about them (or not), return the systems (or not), or give away the notebooks on their sites. Of course, since Vista is not yet available to consumers and a high-end laptop with everything pre-installed is a tempting offer, many of the bloggers jumped.

Some critics have argued Microsoft's practice is unethical; ethics aside, it certainly isn't uncommon in the technology industry, and is no doubt viewed by many as a savvy public relations move. A more pressing concern might be when reviewers fail to disclose how they receive review units, the terms attached to them, and any other conflicts of interest associated with their work. At least most of the bloggers Microsoft contacted have been up-front (even giddy) about their newfound boons; several say they plan to return the units to Microsoft.

Microsoft and PR firm Edelman have refused to say how many laptops were given to bloggers, or the units' value. Published reports have the number of systems between 80 and 100, with street prices ranging from $2,000 to $2,400 apiece.

(As a rule, I rarely review hardware or software, but, for the record, any review items I receive are either returned to the manufacturer or, with permission, donated to a charity or user group, and I disclose any relationship with the manufacturer or developer in the text of the review.)

(c) news.digitaltrends.com

Monday, January 1, 2007

No Longer "Made in Taiwan"

Innovation over simple contract manufacturing is the key to Taiwan's economic future, or demise

"Made in Taiwan," a label ingrained in the memory of every American, is quickly becoming an icon of the '90s rather than the everyday slogan it once was.

Contract manufacturing now sits in Taiwan's rear-view mirror as the country moves beyond manufacturing alone. Taiwan's industry was primarily based on manufacturing, but since the late 1990s Taiwanese companies have turned to cell phone development and PC design, adding innovation to their list of operations.

Four Taiwanese manufacturers currently control over 70% of notebook manufacturing; other Taiwanese notebook brands control another 10%. As early as 2001, the capacity to build notebooks in Taiwan itself had all but vanished: the "Big 4" in notebook assembly build exclusively in China.

One of the largest outsourcers to Taiwan is Dell, the world's second largest PC maker. Over 50 percent of Dell's product development is done in its Taiwan branch, according to CNET. That branch began with 50 employees in early 2002 and currently has over 300.

Taiwan's Industrial Technology Research Institute announced that Taiwan had overtaken Japan in mobile phone PCB production earlier this year. Taiwan Semiconductor Manufacturing Company, better known as TSMC, has a 50% global market share for semiconductor foundries alone. Taiwanese manufacturers have nearly a 100% global market share for retail motherboards, power supplies, PC cooling and enclosures. However, there is a trend with all of these big Taiwan claims: virtually all of the production is done inside China.

Until yesterday, Taiwanese companies were barred from using manufacturing technology on process nodes smaller than 0.25 micron outside of Taiwan. The Taiwanese Ministry of Economic Affairs has just declared the 180nm node legal for Taiwanese-owned, Chinese-based manufacturing.

Even with all of these strong indicators that Taiwan has controlling interests in the world of PC manufacturing, investment analysts are already wary of Taiwanese companies in 2007. The TAIEX, or Taiwan Capitalization Weighted Stock Index, has lagged behind other Asian markets for five consecutive years -- about the same time the country began moving its manufacturing base from Taiwan to China.

(c) www.dailytech.com
