Everything posted by FML

  1. "For your health, consult a DEX!"
  2. @God: My friends don't use laptops, only desktops. @Gaby: Sure, I can... Is there even the slightest chance that this DVI will behave the same way as the [new] VGA one?
  3. I've noticed this inconvenience for some time now... My browser isn't the newest either... I've switched over to IE.
  4. I still don't understand what you want with Windows XP... When you format it [the C partition], do you do it straight from Windows [while it is running]? As Gaby told you here: disable Secure Boot in the BIOS and you'll be able to do whatever you want. Any partition that an operating system lives on has to be formatted from outside that system [from the boot/setup environment, not from the running Windows].
  5. I prefer my classic way of doing the math. From what I remember, even the calculator on an older mobile phone can compute logarithms, and... the results it gives are a disaster...
  6. Here you go: https://support.google.com/accounts/answer/32050?hl=ro .
  7. Good, good... a few more things I was unsure about cleared up...
  8. I do builds fairly often. At least I keep it fresh.
  9. I've been thinking about something. A GPU has two of these connectors... VGA, DVI + [possibly] HDMI. Would anything change with one of those cables that has VGA on one end and DVI on the other? [with or without a VGA-DVI adapter in between].
  10. @God - when you say the port on the PC, do you mean the one where the cable coming into the case plugs in [VGA], or the power cable? I can't try with another desktop/laptop. @Gaby - Yes, 60 Hz. I also tried 75. No change. The stripes are there all the time.
  11. Brilliant way of explaining it. Big like!
  12. Because of a lack of schooling.
  13. As you can see below: http://www36.zippyshare.com/v/74705234/file.html , http://www19.zippyshare.com/v/58907971/file.html . He wants to put Windows XP over 8. I ran into the same problem recently. You see, because of some [very big] BIOS differences, you can't install XP by reformatting a PC that shipped with Windows 8. Dan helped me understand this; I pestered him with questions for a while. He also gave me an extremely good piece of advice, which I'm passing on to you: install Windows 7 Professional. As an aside, if you're a gamer, I'd say go with 7 PRO 64-bit.
  14. For about a year now, my monitor [LG W1941] has been showing faint diagonal stripes all the time. I've tried changing the resolution [lower ones, different refresh rates, etc.] and replacing the VGA cable, but the stripes persist. Interestingly, if I wiggle the cable they thin out a bit... I've played around in the monitor's menu [Auto/Set, Contrast, etc.], it doesn't help... What could the cause be?
  15. So here, too, it comes down to the "Administrator" issue, just like on Windows.
  16. I've played around a bit with these things on other communities too. Back then I was using PhotoScape, just enough so that my "sophisticated" signature wouldn't be an eyesore. I don't think I'm "good at it" anymore... Especially since, in the meantime, my [admittedly rather vague] knowledge has merged with the idea of Photoshop CS3, CS5, CS6... What are my chances of convincing you to teach me enough to do roughly what you can do right now?
  17. Vlad... Don't give me ideas =]] Good article; at the very least, you no longer have to go to a repair shop just for that [like a classmate of mine did].
  18. With 4K / Ultra HD breaking into the mainstream both in computer monitors and televisions, you might be wondering what the capabilities of the four most popular connection types are, and which you should use. Welcome to our guide on the merits and pitfalls of HDMI, DVI, DisplayPort and good ole' VGA. Learn what's new, what's old and what's just straight-up outdated.

HDMI

These days, virtually all TVs and computer monitors support an HDMI connection. HDMI, which stands for High-Definition Multimedia Interface, shoots both digital audio and video down the same cable. Chances are, if you are trying to connect something to your television – and that includes computers – you're going to want to use HDMI. HDMI is used in a very broad array of consumer electronics products, including laptop and desktop computers, mobile devices, the Chromecast dongle, Roku's streaming stick, Blu-ray players, HD cable boxes, and much, much more – so it's a familiar and appealing format for most folks, and easily the most popular among general consumers.

Until very recently, HDMI v1.4 was the standard by which consumer electronics companies operated. There's a good chance that all of the gear in your home is HDMI 1.4, but you should know that there's a new version out, called HDMI 2.0, which takes HDMI's capabilities to the next level. After the introduction of 4K/Ultra HD televisions came HDMI 2.0. HDMI 2.0 can pass video signals with a pixel resolution of 3840 x 2160 at up to 60 frames per second, along with up to 32 channels of uncompressed multichannel digital audio, all through the same high-speed HDMI cables that have been around for years. That's right: nothing about the cables or connectors has changed, only the hardware you connect them to. So there's no need to expect to have to buy a bunch of new cables if/when you decide to upgrade. You can learn more about the latest version of HDMI right here. Since HDMI has progressed to this new version, there's now even less reason to go with any of these other types of connections, except in some very specific situations, which we cover below.

DisplayPort

A digital display interface developed by the Video Electronics Standards Association (VESA), DisplayPort isn't an option for consumer-level HDTV use (unless you plan on owning Panasonic's top-of-the-line 4K TV, which is the only consumer television we are aware of that supports DisplayPort). However, DisplayPort is a perfectly capable option (some would say preferred) for connecting your PC to a computer monitor. With all of the necessary hardware add-ons and software updates, DisplayPort version 1.2 offers a maximum resolution of 3,840×2,160 at 60 FPS, which makes it ready to tackle 4K/Ultra HD content, and passes digital audio as well – just like HDMI.

Despite HDMI's prevalence today, DisplayPort boasts a couple of features that position it as a direct alternative – one that has earned its own cult of enthusiasts who swear by the connection type. Chief among them is DisplayPort's multi-monitor capabilities, which make the format an excellent choice for graphic designers, programmers, and anyone else working with computers all day. Users can daisy-chain up to five monitors together in order to better streamline their working habits. There are many applications for such a setup – perhaps the most obvious and useful is the ability to reference material on one screen while typing on another, eliminating the constant alt-tab madness.

While the current version of DisplayPort is 1.2, VESA recently announced plans to issue 1.2a, which is meant to tackle graphical tearing and stuttering problems by integrating something called Adaptive-Sync. The new technology will attempt to eliminate the problem by aligning the system's GPU output with the monitor's refresh rate. Adaptive-Sync is also rumored to be able to draw down refresh rates for less demanding tasks, possibly resulting in decreased power consumption.

DVI

DVI (Digital Visual Interface) rose to prominence as the standard display connection format around 1999, but over time HDMI has effectively replaced it. DVI is designed to deliver uncompressed digital video and can be configured to support multiple modes such as DVI-D (digital only), DVI-A (analog only), or DVI-I (digital and analog). The digital video signal passed over DVI is essentially identical to HDMI, though there are differences between the two formats, namely DVI's lack of an audio signal. You won't find DVI on HD televisions or Blu-ray players anymore, but you wouldn't want to use DVI for your flat-screen TV anyway, since additional audio cables would be required. But for computer monitors, which often lack speakers anyway, DVI is still a popular option. You'll also find DVI connectors on some older projectors, usually hiding in some dusty corner of an office. If you want 4K, though, you'll need to go with HDMI or DisplayPort.

There are two different types of DVI connectors, single-link and dual-link. The dual-link DVI connector's extra pins effectively double the transmission capacity and provide higher speed and signal quality. For example, an LCD display using a single-link DVI connector can reach a maximum resolution of 1920×1200, while dual-link's maximum for that same screen is 2560×1600.

VGA

Once the industry standard and now a video connector with one foot already out the door, VGA (Video Graphics Array) is an analog, video-only connection that's rarely seen on TVs anymore, though you'll still find it lingering on older PCs and projectors. At the end of 2010, a collective of big tech companies such as Intel and Samsung came together to bury VGA, announcing plans to abandon the format and speed up their adoption of HDMI and DisplayPort as default interfaces for PC monitors. We don't recommend going out of your way to use VGA, but if it's all you've got lying around – and you're not particularly picky about picture quality – it will do in a pinch. Sometimes the 15-pin connector is referred to as "PC-RGB," "D-sub 15" or "DE-15," and some laptops and other smaller devices come with mini-VGA ports onboard, in place of the full-sized VGA connector.

Conclusion

If you're connecting to a television, HDMI is the way to go. If you're a gamer or find yourself on a computer all day, DisplayPort might be your best option, especially now that it has become more popular and, thus, better supported. DVI and VGA are still solid computer monitor connections, but VGA is limited in its image quality potential, and won't be a topic of conversation much longer. At the end of the day, we stand behind HDMI and DisplayPort as the top connections of choice.

Read more: http://www.digitaltrends.com/computing/hdmi-vs-dvi-vs-displayport-vs-vga/#ixzz3GhscNfMP .
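As a rough companion to the bandwidth and resolution figures quoted above (not part of the original article), here is a minimal Python sketch that estimates the uncompressed bit rate a video mode needs and compares it against approximate per-link payload rates. The connector limits and the ~20% blanking allowance are my own ballpark assumptions, used purely for illustration.

```python
# A minimal sketch (ballpark figures, not from the article): estimate the
# uncompressed bit rate a video mode needs and see which digital links could carry it.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_factor=1.2):
    """Approximate bit rate in Gbit/s, padded ~20% for blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel * blanking_factor / 1e9

# Approximate usable payload rates per link (Gbit/s) after encoding overhead,
# illustrative values only; VGA is analog, so it has no digital bit rate.
LINK_LIMITS = {
    "DVI single-link": 3.96,
    "DVI dual-link": 7.92,
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2": 17.28,
}

if __name__ == "__main__":
    modes = {"1080p @ 60 Hz": (1920, 1080, 60), "4K/UHD @ 60 Hz": (3840, 2160, 60)}
    for name, (w, h, hz) in modes.items():
        need = video_bitrate_gbps(w, h, hz)
        ok = [link for link, cap in LINK_LIMITS.items() if cap >= need]
        print(f"{name}: needs ~{need:.1f} Gbit/s -> {', '.join(ok)}")
```

With these assumptions, 4K/UHD at 60 Hz lands around 14 Gbit/s, which is roughly why the article steers you toward HDMI 2.0 or DisplayPort 1.2 for 4K and rules out DVI.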
  19. New CPU and GPU architectures roil the market pretty much every year—sometimes more than once a year. Yet in spite of the impact that system memory can have on a PC's performance, the industry has relied on the same basic memory architecture for what seems like an eternity—in tech time, at least. DDR3 SDRAM (the third generation of double data rate synchronous DRAM) was introduced way back in 2007. Carrie Underwood had scored her first Grammy. Russian leader Boris Yeltsin died. And Barry Bonds broke Hank Aaron's home-run record. Now the PC industry is finally preparing to transition to DDR4 memory. What took so long?

Part of the reason for the long gestation period is that memory manufacturers compete more on price than performance. And unlike the CPU and GPU markets, where just two companies dominate, memory standards are developed by a committee: the Joint Electron Devices Engineering Council (JEDEC). If you want a standard to develop slowly, do it by committee (consider how long the IEEE is taking to ratify the 802.11ac Wi-Fi standard). JEDEC, which consists of every memory maker in the world, started work on the DDR4 spec in 2005—two years before DDR3 even hit the market—but the first test samples didn't appear until 2011. DDR4 memory finally hit the market last year in very limited supply, but the industry only shifted into high gear around Computex 2014. What's so great about DDR4? Read on and the truth shall be revealed.

What exactly is DDR4?

There are a lot of deeply technical aspects to DDR4, but we won't dive that far. The two key improvements in DDR4 are power consumption and data transfer speed, thanks to the development of an all-new bus. DDR4 memory will deliver significant benefits in terms of both performance and power consumption.

DDR3 generally requires 1.5 volts of electrical power to operate. DDR4 needs 20 percent less—just 1.2 volts. DDR4 also supports a new, deep power-down mode that will allow the host device to go into standby without needing to refresh its memory. Deep power-down mode is expected to reduce standby power consumption by 40 to 50 percent. Less power draw means less heat and longer battery life, so laptops and servers are expected to be the biggest beneficiaries of the jump to DDR4. Servers can be deployed with as much as a terabyte of memory and they routinely operate 24/7, so the power bills to keep them running—along with the onboard fans and outboard ventilation systems to keep them cool—can be enormous. Mid-range and high-end laptops routinely ship with 8GB of memory, so the 20-percent reduction in power consumption is more important for extending battery life than for reducing utility bills. The LCD panel remains the biggest power draw, and the CPU eats its share of juice, but every little bit helps.

Smartphones and tablets will benefit from DDR4 memory, too. Because they typically come with only 1GB or 2GB of memory—and their displays consume much more power than their memory—they'll benefit much like laptops will, from extended battery life rather than lower power bills. But that hasn't stopped Qualcomm from getting into the game. Its Snapdragon 810 mobile processor uses low-power DDR4 memory, and devices using this chip are expected to ship in the first half of 2015.

Reducing power consumption will give desktop PC users a warm, green feeling, but they'll probably appreciate DDR4's speed bump a lot more. DDR4 memory kits shown off at Computex boasted speeds ranging from 2133MHz to 3200MHz, and DDR4 could eventually hit 4266MHz. DDR3 memory topped out at 2133MHz, so there's no question memory will be a lot faster. Finally, DDR4 uses much higher-density chips, so each memory stick (DIMM, technically) will pack a lot more memory. Where you might buy DDR3 memory in 1- or 2GB kits for desktops and notebooks, expect to see 4- and 8GB kits with DDR4. And for high-end servers, each DDR4 DIMM could deliver 64GB or even 128GB of memory.

Do you need DDR4 memory? Will you ever?

Before you get too excited about DDR4, note that it hasn't even reached bleeding-edge status. You can't buy DDR4 memory today, and your existing hardware wouldn't be able to use it if you could. But it's a safe bet that it will be expensive when it does come to market. Mike Howard, memory analyst at the research firm IHS, said he expects DDR4 memory to launch later this year at prices 40 to 50 percent higher than DDR3 memory. So if you were to buy 16GB of DDR3 memory at the average price of $140, the same amount of DDR4 memory would set you back around $210. "As we go forward, DDR4 will get more engineering resources," said Howard. "2016 is when we will see price parity with DDR3. Then it will get cheaper as more people put resources into it."

Improvements in memory technology occur at a relatively stately pace, and Howard doesn't consider DDR4 a must-have update for most people. "Users don't need 2400MHz speeds," he said. "In the PC world, except for the power-user segment, people aren't screaming for more memory bandwidth." Kelt Reeves, president of boutique PC builder Falcon Northwest, echoed that sentiment. "On current-generation CPUs, we see almost no benefit in DDR3 speeds above 1866MHz," he said. "For 2133MHz and higher, you have to specifically run memory bandwidth tests to see anything outside of margin-of-error in most benchmarks." According to Reeves, DDR4's lower power requirements—and the corresponding reduction in waste heat—will be this technology's real draw. "Memory has become so much more reliable in recent years with the voltage drops from 2.1 to 1.8 to now 1.5 volts," he said.

An investment in DDR4 will also entail a motherboard upgrade, because you'll need a new chipset. Intel's upcoming X99 chipset will support DDR4 memory, along with a new Extreme Edition of its Haswell CPU (codenamed Haswell-E). And it's precisely that power-user segment that would consider paying $1000 for Intel's best processor. If that doesn't describe you, you don't need to worry about jumping into a major upgrade anytime soon, or even postponing your next PC purchase until models with DDR4 come out. That's not to say DDR4 will be a waste of money. It's just that in its early days, it won't deliver significant benefits to anyone beyond the earliest of adopters.

Source : http://www.pcworld.com/ .
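To make the speed and price claims above a bit more concrete, here is a small sketch (my own arithmetic, not from the PCWorld piece) that turns DDR transfer rates into theoretical per-channel bandwidth and reproduces the article's 40 to 50 percent price estimate. Note that the "MHz" figures quoted for DDR memory are really megatransfers per second.

```python
# A minimal sketch of the arithmetic behind the article's numbers (assumptions mine):
# peak bandwidth of one DDR channel ~= transfer rate (MT/s) * bus width (bytes).

def peak_bandwidth_gbs(megatransfers_per_sec, bus_width_bits=64):
    """Theoretical peak for one 64-bit channel, in GB/s (1 GB = 1e9 bytes)."""
    return megatransfers_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

for label, mts in [("DDR3-2133", 2133), ("DDR4-3200", 3200), ("DDR4-4266", 4266)]:
    print(f"{label}: ~{peak_bandwidth_gbs(mts):.1f} GB/s per channel")

# The article's price estimate: DDR4 launching 40-50 percent above a $140 16GB DDR3 kit.
ddr3_kit_price = 140
print(f"Estimated 16GB DDR4 kit: ${ddr3_kit_price * 1.4:.0f}-${ddr3_kit_price * 1.5:.0f}")
```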
  20. The folks at Ginx released it too at one point... I was expecting these requirements. It's CoD, so I think it will be very well received anyway.
  21. What is USB 3.0 and USB 2.0?

Universal Serial Bus (USB) is an industry standard developed in the mid-1990s that defines the cables, connectors and communications protocols used in a bus for connection, communication and power supply between computers and electronic devices. Nowadays even devices like smartphones, PDAs and video game consoles connect to computers over USB ports, allowing recharging and communication and thereby replacing the need for separate adapters and power chargers. USB 3.0 was released in November 2008, almost eight years after the release of USB 2.0.

USB 3.0 Highlights and Benefits over USB 2.0

  • Transfer rates - USB 2.0 offers transfer rates of 480 Mbps and USB 3.0 offers transfer rates of 4.8 Gbps - that's 10 times faster.
  • Addition of another physical bus - The number of wires has been doubled, from 4 to 8. The additional wires require more space in both the cables and connectors, so there are new types of connectors.
  • Power consumption - USB 2.0 provides up to 500 mA whereas USB 3.0 provides up to 900 mA. USB 3.0 devices can draw more power when needed and conserve power when connected but idling.
  • More bandwidth - instead of one-way communication, USB 3.0 uses two unidirectional data paths, one to receive data and the other to transmit, while USB 2.0 can only handle one direction of data at any time.
  • Improved bus utilization - a new feature has been added (using NRDY and ERDY packets) to let a device asynchronously notify the host of its readiness.

When data is being transferred through USB 3.0 devices, cables and connectors, the transaction is initiated by the host making a request, followed by a response from the device. The device either accepts the request or rejects it. If accepted, the device sends data to, or accepts data from, the host. If there is a lack of buffer space or data, it responds with a Not Ready (NRDY) signal to tell the host that it is not able to process the request. When the device is ready, it sends an Endpoint Ready (ERDY) signal to the host, which then reschedules the transaction.

Physical Differences

USB 3.0 connectors are different from USB 2.0 connectors, and the 3.0 connectors are usually colored blue on the inside in order to distinguish them from the 2.0 connectors.

Backward Compatibility

USB 3.0 is compatible with USB 2.0. However, a USB 3.0 product used with USB 2.0 hardware will perform at the USB 2.0 level, so the speed and power benefits will not be fully realized. USB 3.0 receptacles are electrically compatible with USB 2.0 device plugs if they physically match. USB 3.0 type-A plugs and receptacles are completely backward compatible, and USB 3.0 type-B receptacles will accept USB 2.0 and earlier plugs. However, USB 3.0 type-B plugs will not fit into USB 2.0 and earlier receptacles. This means that USB 3.0 cables cannot be used with USB 2.0 and USB 1.1 peripherals, although USB 2.0 cables can be used with USB 3.0 devices, albeit at USB 2.0 speeds.

Here is a good informational video explaining the speed features of USB 3.0 vs USB 2.0: https://www.youtube.com/watch?v=7TCORt8b92U

Price

For a similar product, the USB 3.0 version is generally more expensive than its USB 2.0 version. You can check the current prices on Amazon for a few USB 2.0 vs USB 3.0 enabled devices (USB 3.0 compatible products; USB 2.0 compatible products).

References:
  • http://en.wikipedia.org/wiki/USB_3.0
  • http://en.wikipedia.org/wiki/Universal_Serial_Bus#USB_2.0
  • http://techie-buzz.com/gadgets-news/usb-3-advantages-disadvantages.html
  • http://www.intel.com/content/www/us/en/io/universal-serial-bus/universal-serial-bus.html
  • http://www.usr.com/education/peripherals0.asp

Main article taken from http://www.diffen.com/ . Video by Asus NA .
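Purely as an illustration of the headline 480 Mbps vs 4.8 Gbps figures above (not from the source article), the sketch below estimates how long a large file transfer would take at each nominal rate; real-world throughput is lower because of protocol overhead.

```python
# A minimal sketch using the nominal signaling rates only;
# real transfers are slower due to protocol overhead.

def transfer_seconds(file_size_gb, link_rate_mbps):
    """Seconds to move file_size_gb gigabytes over a link of link_rate_mbps megabits/s."""
    bits_to_move = file_size_gb * 8e9
    return bits_to_move / (link_rate_mbps * 1e6)

file_size_gb = 4.7  # e.g. a single-layer DVD image, chosen arbitrarily for the example
print(f"USB 2.0 (480 Mbps):  ~{transfer_seconds(file_size_gb, 480):.0f} s")
print(f"USB 3.0 (4800 Mbps): ~{transfer_seconds(file_size_gb, 4800):.0f} s")
```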
  22. What is the difference between SATA I, SATA II and SATA III?

The SATA I (revision 1.x) interface, formally known as SATA 1.5Gb/s, is the first-generation SATA interface, running at 1.5 Gb/s. The bandwidth throughput supported by the interface is up to 150 MB/s.

The SATA II (revision 2.x) interface, formally known as SATA 3Gb/s, is the second-generation SATA interface, running at 3.0 Gb/s. The bandwidth throughput supported by the interface is up to 300 MB/s.

The SATA III (revision 3.x) interface, formally known as SATA 6Gb/s, is the third-generation SATA interface, running at 6.0 Gb/s. The bandwidth throughput supported by the interface is up to 600 MB/s. This interface is backward compatible with the SATA 3 Gb/s interface.

SATA II specifications provide backward compatibility to function on SATA I ports. SATA III specifications provide backward compatibility to function on SATA I and SATA II ports. However, the maximum speed of the drive will be lower, due to the speed limitations of the port.

Example: the SanDisk Extreme SSD, which supports the SATA 6Gb/s interface, can reach up to 550/520 MB/s sequential read and sequential write speeds respectively when connected to a SATA 6Gb/s port. However, when the drive is connected to a SATA 3 Gb/s port, it can reach up to 285/275 MB/s sequential read and sequential write speeds respectively.

Source : http://www.sandisk.com/ .
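As a quick check of how the 1.5/3/6 Gb/s line rates map to the 150/300/600 MB/s throughput figures above, here is a small sketch. It assumes SATA's 8b/10b encoding (every 10 bits on the wire carry 8 bits of payload), which is a property of the interface but not something the SanDisk note spells out.

```python
# A minimal sketch: SATA line rate -> usable throughput, assuming 8b/10b encoding
# (each 10-bit symbol on the wire carries one 8-bit byte of payload).

def sata_throughput_mb_per_s(line_rate_gbps):
    """Usable throughput in MB/s for a given SATA line rate."""
    bytes_per_second = line_rate_gbps * 1e9 / 10  # 10 wire bits per payload byte
    return bytes_per_second / 1e6

for generation, rate in [("SATA I", 1.5), ("SATA II", 3.0), ("SATA III", 6.0)]:
    print(f"{generation} ({rate} Gb/s): ~{sata_throughput_mb_per_s(rate):.0f} MB/s")
```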
  23. Yeah... I've also played Bitefight.
  24. I want an alternative operating system [to Windows].