Search Results

Search found 56140 results on 2246 pages for 'super computer'.

Page 34 of 2246

  • Computer making strange sound when turned on, ever since power outage

    - by Dot NET
    Recently we experienced a power outage, and the PC was off. However, once the power came back, I switched on the PC and heard a strange noise - almost as if the hard disk or fans were struggling to work. I can't really describe the sound, but it's a laboured, loud sound, almost like a jackhammer. This has been persisting ever since the power outage; however, the noise stops after around 10 minutes or so, and doesn't start again until the computer is turned off and on again. At first I thought it had something to do with the HDD, but all my files are intact, chkdsk did not report any issues, and performance is 100% unchanged, even in games (so the gfx card is fine, and so is the HDD most likely). My PC setup has around 3 cooling fans, but I'm not sure if it's one of these either, as the noise actually stops after 10 minutes or so, and if I leave the PC on for 4 hours (for example) the noise never starts again. It's there solely when turning on the PC. I haven't got a UPS, and it's important to note that the computer was not on when the power went out - it was merely plugged in. I then promptly unplugged the PC once the power was out, and only plugged it in again when the power came back. Could it be the power supply? Unfortunately I can't open my tower as I would void the warranty. Are there any tests which I could carry out without voiding the warranty?

    Read the article

  • Computer will not boot - disk read error - cannot boot from HD or DVD

    - by Grant Palin
    This is a 3-year-old system: an HP a1640n. There have been no issues with it in the past. I added a video card 2 years ago, and more memory 1 year ago, both without issues. There haven't been any recent hardware changes. I did install Win7 in Oct., but there were no issues with that either. I used the computer fine two nights ago, and turned it off. Yesterday, I tried to turn it on, and got the error: "A Disk Read Error Occurred. Press CTRL ALT DEL to restart". So I restart, see the initial start screen (HP) and enter the BIOS. The hard drive and DVD drive appear to be listed, but their names are gibberish text. I tried putting a Windows disk in the DVD drive and continued with the boot, but the disk did not get recognized, even though the BIOS was set to check for optical media before the hard drive. Back to the error screen. If the computer would boot from a CD or DVD, I would just figure the hard drive needed replacing. But both being problematic worries me. Is this a matter of replacing both the hard drive and DVD drive, or might it be an indication of a bigger problem? Thanks for any advice.

    Read the article

  • Some Portions of Computer Running Slow (Specifically Graphics)

    - by Mike Gates
    I noticed that a few things are running slow today on my Windows 7 laptop. Specifically, they are:
    - Opening and closing windows takes several seconds for the animation to complete.
    - Windows Media Player opens fine, but the movies are very laggy.
    - MMORPGs, such as RuneScape, are extremely laggy.
    - When waking my computer from sleep mode, after entering my password, my desktop takes about 3 seconds to fade in.
    Other than those, everything runs at a normal speed. Things I've done that may have contributed to this problem:
    - Changed the graphics processor (by plugging in/unplugging the charger) [however, no matter how I change the graphics, I'm still getting this lagginess]
    - Installed AdBlock, a Firefox add-on [I recently removed it, and I'm still experiencing this problem]
    - Went into Advanced System Settings, clicked Settings, and unchecked a few visual things (such as the animation for opening and closing windows) [sure, this got rid of the opening/closing windows lag, but I like that little animation - plus that leaves all the other lag problems I'm experiencing]
    So, does anyone have any ideas/fixes? If so, please respond. Thank you. Some other information: I'm on an HP Pavilion dv7 laptop, 4285 Entertainment PC, with an Intel Core i5, ATI Mobility Radeon Premium Graphics, and Microsoft DirectX 11.
    - Opening and closing of windows: defined as opening a program (i.e. Firefox) or closing it by hitting the X in the upper-right hand corner. Lately the animation for opening and closing windows (which is simply either growing from the icon on the taskbar to fill the screen, or shrinking from the screen down towards the icon on the taskbar) has been slow. This problem also occurs for minimizing/maximizing windows.
    - Very laggy movies: defined as .avi movie files saved to My Documents which skip several frames per second and seemingly slow down the movie as a whole.
    - Extremely laggy games: I tried RuneScape today, and movement in the game was at least 10x slower than it ever has been, even when playing on the lowest detail/graphics.
    - Desktop taking 3 seconds to fade in after sleep: in this scenario, I had no other programs visibly running. The computer normally fades from the password screen to the desktop in about 1 second; however, it is now taking 3 or more seconds.

    Read the article

  • How can I access my desktop computer from my Android phone?

    - by Qurben
    Is it possible to access a computer that gets its internet connection through an Android phone (by tethering)? I want to use ssh to connect to the computer (from a different computer on the same network), but I am not able to reach it. Is it possible to port-forward, use some kind of transparent proxy, or use a DMZ? My phone is rooted, I have CyanogenMod installed, and I can use iptables. EDIT: The changed title completely changed the question! My setup is the following: I have an Android phone connected to a computer through a USB cable, tethering internet from the phone, and I wanted to ssh into the computer behind the Android phone from another computer on the same network as the phone. This was not possible, because the Android phone creates a separate network for the connected computer, effectively shielding it from any incoming connections. It turned out to be quite simple to fix by just using iptables.

    Read the article

  • Computer won't reboot without waiting for a while

    - by Benjamin
    I've got an unusual problem with my computer. Whenever I reboot my computer it won't boot: I get a few beeps from the BIOS and nothing else. However, if I wait for a few minutes the computer will boot perfectly. I tried to count the beeps and I get around 7-9 of them; the first two are noticeably closer together than the rest. [Edit: I'm now reasonably confident it's 1 long beep followed by 8 short beeps. That would be a display-related issue: http://www.bioscentral.com/beepcodes/amibeep.htm] My BIOS is American Megatrends Inc, version P1.80, and the motherboard is an ASRock X58 Extreme (both according to dmidecode). Here's the output from lspci; I'm not sure what else might be useful, but I can provide whatever's asked.
    00:00.0 Host bridge: Intel Corporation 5520/5500/X58 I/O Hub to ESI Port (rev 13)
    00:01.0 PCI bridge: Intel Corporation 5520/5500/X58 I/O Hub PCI Express Root Port 1 (rev 13)
    00:03.0 PCI bridge: Intel Corporation 5520/5500/X58 I/O Hub PCI Express Root Port 3 (rev 13)
    00:07.0 PCI bridge: Intel Corporation 5520/5500/X58 I/O Hub PCI Express Root Port 7 (rev 13)
    00:14.0 PIC: Intel Corporation 5520/5500/X58 I/O Hub System Management Registers (rev 13)
    00:14.1 PIC: Intel Corporation 5520/5500/X58 I/O Hub GPIO and Scratch Pad Registers (rev 13)
    00:14.2 PIC: Intel Corporation 5520/5500/X58 I/O Hub Control Status and RAS Registers (rev 13)
    00:14.3 PIC: Intel Corporation 5520/5500/X58 I/O Hub Throttle Registers (rev 13)
    00:1a.0 USB controller: Intel Corporation 82801JI (ICH10 Family) USB UHCI Controller #4
    00:1a.1 USB controller: Intel Corporation 82801JI (ICH10 Family) USB UHCI Controller #5
    00:1a.2 USB controller: Intel Corporation 82801JI (ICH10 Family) USB UHCI Controller #6
    00:1a.7 USB controller: Intel Corporation 82801JI (ICH10 Family) USB2 EHCI Controller #2
    00:1b.0 Audio device: Intel Corporation 82801JI (ICH10 Family) HD Audio Controller
    00:1c.0 PCI bridge: Intel Corporation 82801JI (ICH10 Family) PCI Express Root Port 1
    00:1c.1 PCI bridge: Intel Corporation 82801JI (ICH10 Family) PCI Express Port 2
    00:1c.5 PCI bridge: Intel Corporation 82801JI (ICH10 Family) PCI Express Root Port 6
    00:1d.0 USB controller: Intel Corporation 82801JI (ICH10 Family) USB UHCI Controller #1
    00:1d.1 USB controller: Intel Corporation 82801JI (ICH10 Family) USB UHCI Controller #2
    00:1d.2 USB controller: Intel Corporation 82801JI (ICH10 Family) USB UHCI Controller #3
    00:1d.7 USB controller: Intel Corporation 82801JI (ICH10 Family) USB2 EHCI Controller #1
    00:1e.0 PCI bridge: Intel Corporation 82801 PCI Bridge (rev 90)
    00:1f.0 ISA bridge: Intel Corporation 82801JIR (ICH10R) LPC Interface Controller
    00:1f.2 SATA controller: Intel Corporation 82801JI (ICH10 Family) SATA AHCI Controller
    00:1f.3 SMBus: Intel Corporation 82801JI (ICH10 Family) SMBus Controller
    01:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 03)
    02:00.0 FireWire (IEEE 1394): VIA Technologies, Inc. VT6315 Series Firewire Controller
    02:00.1 IDE interface: VIA Technologies, Inc. VT6415 PATA IDE Host Controller (rev a0)
    03:00.0 SATA controller: JMicron Technology Corp. JMB360 AHCI Controller (rev 02)
    05:00.0 VGA compatible controller: nVidia Corporation GT200b [GeForce GTX 285] (rev a1)
    ff:00.0 Host bridge: Intel Corporation Xeon 5500/Core i7 QuickPath Architecture Generic Non-Core Registers (rev 05)
    ff:00.1 Host bridge: Intel Corporation Xeon 5500/Core i7 QuickPath Architecture System Address Decoder (rev 05)
    ff:02.0 Host bridge: Intel Corporation Xeon 5500/Core i7 QPI Link 0 (rev 05)
    ff:02.1 Host bridge: Intel Corporation Xeon 5500/Core i7 QPI Physical 0 (rev 05)
    ff:03.0 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller (rev 05)
    ff:03.1 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Target Address Decoder (rev 05)
    ff:03.4 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Test Registers (rev 05)
    ff:04.0 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 0 Control Registers (rev 05)
    ff:04.1 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 0 Address Registers (rev 05)
    ff:04.2 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 0 Rank Registers (rev 05)
    ff:04.3 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 0 Thermal Control Registers (rev 05)
    ff:05.0 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 1 Control Registers (rev 05)
    ff:05.1 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 1 Address Registers (rev 05)
    ff:05.2 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 1 Rank Registers (rev 05)
    ff:05.3 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 1 Thermal Control Registers (rev 05)
    ff:06.0 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 2 Control Registers (rev 05)
    ff:06.1 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 2 Address Registers (rev 05)
    ff:06.2 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 2 Rank Registers (rev 05)
    ff:06.3 Host bridge: Intel Corporation Xeon 5500/Core i7 Integrated Memory Controller Channel 2 Thermal Control Registers (rev 05)
    Update: OK, I installed lm-sensors and here's the output.
    coretemp-isa-0000
    Adapter: ISA adapter
    Core 0: +58.0°C (high = +80.0°C, crit = +100.0°C)
    Core 1: +59.0°C (high = +80.0°C, crit = +100.0°C)
    Core 2: +58.0°C (high = +80.0°C, crit = +100.0°C)
    Core 3: +57.0°C (high = +80.0°C, crit = +100.0°C)
    it8720-isa-0a10
    Adapter: ISA adapter
    in0: +0.93 V (min = +0.00 V, max = +4.08 V)
    in1: +0.06 V (min = +0.00 V, max = +4.08 V)
    in2: +3.25 V (min = +0.00 V, max = +4.08 V)
    +5V: +2.91 V (min = +0.00 V, max = +4.08 V)
    in4: +3.04 V (min = +0.00 V, max = +4.08 V)
    in5: +2.94 V (min = +0.00 V, max = +4.08 V)
    in6: +2.14 V (min = +0.00 V, max = +4.08 V)
    5VSB: +2.96 V (min = +0.00 V, max = +4.08 V)
    Vbat: +3.28 V
    fan1: 1869 RPM (min = 0 RPM)
    fan2: 0 RPM (min = 0 RPM)
    fan3: 0 RPM (min = 0 RPM)
    fan4: 1106 RPM (min = -1 RPM)
    fan5: 225000 RPM (min = -1 RPM)
    temp1: +39.0°C (low = +0.0°C, high = +127.0°C) sensor = thermistor
    temp2: +56.0°C (low = +0.0°C, high = +127.0°C) sensor = thermistor
    temp3: +127.0°C (low = +0.0°C, high = +127.0°C) sensor = thermistor
    cpu0_vid: +1.650 V
    intrusion0: ALARM
    If it helps, here's the summary from sensors-detect:
    Driver `it87': * ISA bus, address 0xa10, Chip `ITE IT8720F Super IO Sensors' (confidence: 9)
    Driver `adt7475': * Bus `NVIDIA i2c adapter 3 at 5:00.0', Busdriver `nvidia', I2C address 0x2e, Chip `Analog Devices ADT7473' (confidence: 5)
    Driver `coretemp': * Chip `Intel digital thermal sensor' (confidence: 9)

    Read the article

  • this and super in Java

    - by abson
    this and super are keywords, aren't they? Then how can we use them for passing arguments to constructors, the same way as with a method call? In short, how is it that they can show such distinct behaviours?
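    A minimal sketch of the distinction being asked about, using made-up class names: written as this(...) or super(...) in the first statement of a constructor, the keywords act as constructor invocations; used anywhere else, this and super are plain references to the current object and to the superclass's members.

        // Hypothetical classes, purely to illustrate the two uses of this/super.
        class Vehicle {
            protected int wheels;

            Vehicle(int wheels) {
                this.wheels = wheels;      // "this" as a reference to the current object
            }

            void describe() {
                System.out.println("A vehicle with " + wheels + " wheels");
            }
        }

        class Car extends Vehicle {
            private final String model;

            Car() {
                this("unknown");           // "this(...)" calls another Car constructor
            }

            Car(String model) {
                super(4);                  // "super(...)" calls the Vehicle constructor
                this.model = model;
            }

            @Override
            void describe() {
                super.describe();          // "super" as a reference to the parent implementation
                System.out.println("It is a car, model: " + model);
            }

            public static void main(String[] args) {
                new Car("roadster").describe();
            }
        }

    The restriction that this(...) and super(...) may only appear as the first statement of a constructor is what separates them from ordinary method calls, even though the syntax looks the same.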

    Read the article

  • Advice for a computer science sophomore in college?

    - by RDas
    Hi Everyone! I'm a sophomore in college majoring in Computer Science and Math. I have always loved programming. I started programming in C when I was nine years old and over the years I've picked up Visual Basic, C#, Java, C++, JavaScript, Objective-C, Python, Ruby, elementary Haskell and elementary Erlang, and I learned Perl back in the day which I've mostly forgotten. I have not done much network programming. I have done CGI programming, but that was about six/seven years ago. I've done some socket programming and written (school) programs to do interprocess communication, which I understood and liked. I'm taking a course on client/server programming and another one on network security next semester, which I am really looking forward to. I'm seeking advice on how to proceed with future learning. I've mostly done application (mobile and desktop) development, not much of web development. I'd like to pick up some web development this coming semester. Since I know Ruby and Python, should I start by learning Django and/or Rails? Any other suggestions on starting web development? I have a good understanding of HTML and CSS. Also, I'd also like to know how hard it is to pick up and be good (read: productive) in functional programming languages coming from a purely structured/object oriented background? I've been reading up on Erlang and Haskell, and I'd like to know your opinions on whether it's worth my time trying to learn them. What about Lisp, Scheme and other functional languages? Any help/ideas would be really appreciated.

    Read the article

  • Nervous about the "real" world

    - by Randy
    I am currently majoring in Computer Science and minoring in mathematics (the minor is embedded in the major). The program has a strong C++ curriculum. We have done some UNIX and assembly language (not fun), and there is C and Java on the way in future classes that I must take. The program I am in did not use the STL, but rather an STL-ish design that was created from the ground up for the program. From what I have read, the STL and what I have used are very similar, but what I used seemed more user-friendly. Some of the programs that I had to write in C++ for assignments include: a password server that utilized hashing of the passwords for security purposes, a router simulator that used a hash table and maps, a maze solver that used depth-first search, a tree traversal program (level-order, post-order, in-order), selection sort, insertion sort, bit sort, radix sort, merge sort, heap sort, quick sort, topological sort, stacks, queues, priority queues, and, my least favorite, red-black trees. All of this was done in three semesters, which was just enough time to code them up and turn them in. That being said, if I were told to use a stack to convert an equation to infix notation or something, I would be lost for a few hours. My main concern in writing this is: when I graduate and land an interview, what are some of the questions posed to assess my skills? What are some of the most important areas of computer science that are prevalent in the field? I am currently trying to get some ideas for programs I can write in C++ that interest and challenge me to keep learning the language. A sudoku solver came to mind, but I am lost as to where to start. I apologize for the rant, but I'm just a wee bit nervous about the future. Any tips are appreciated.
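    For what it's worth, the stack exercise mentioned above is smaller than it sounds. Here is a minimal sketch of converting a postfix expression into parenthesised infix form with a stack (in Java rather than the C++ used in the coursework, and assuming space-separated tokens with only the binary operators + - * /; the class and method names are made up for illustration).

        import java.util.ArrayDeque;
        import java.util.Deque;

        // Rough sketch of the classic stack exercise: postfix -> fully parenthesised infix.
        public class PostfixToInfix {

            static String toInfix(String postfix) {
                Deque<String> stack = new ArrayDeque<>();
                for (String token : postfix.trim().split("\\s+")) {
                    if (token.length() == 1 && "+-*/".contains(token)) {
                        String right = stack.pop();   // operands come off in reverse order
                        String left = stack.pop();
                        stack.push("(" + left + " " + token + " " + right + ")");
                    } else {
                        stack.push(token);            // operand: push as-is
                    }
                }
                return stack.pop();                   // the remaining entry is the whole expression
            }

            public static void main(String[] args) {
                System.out.println(toInfix("3 4 + 5 *"));   // prints ((3 + 4) * 5)
            }
        }

    The same push-two-pop-combine pattern, run in the other direction with an operator-precedence rule, is the usual infix-to-postfix (shunting-yard) interview exercise.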

    Read the article

  • How to learn the math behind the code?

    - by Solomon Wise
    I am a 12-year-old who has recently gotten into programming. (Although I know that the number of books you have read does not determine your programming competency or ability, here is a "map" of where I am in terms of the content I know...) I've finished the books Python 3 For Absolute Beginners, Pro Python, Python Standard Library by Example, Beautiful Code, and Agile Web Development With Rails, and am about halfway into Programming Ruby. I have written many small programs (one that finds which files have been updated and deleted in a directory, one that compares multiple players' fantasy baseball value, some text-based games, and many more). Obviously, as I'm not some sort of child prodigy, I can't take a formal Computer Science course until high school. I really want to learn computer science to increase my knowledge about the code and how the code runs. I've really become interested in the math part after reading the source code for Python's random module. Is there a place where I can learn CS, or the math behind programming, online for free, at a level that would be at least partially understandable to a person my age?

    Read the article

  • I need advice on laptop purchase for university [closed]

    - by Systemic33
    I'm currently at university studying Computer Science/IT/Information Technology, and this first year I've managed to get by with the laptop I had: an ASUS Eee PC 1000H with a 10.1" screen. But this is getting way too underpowered and small for programming more than just quick introductory exercises, so I'm looking to buy a more suitable laptop. It's not supposed to be a desktop replacement though, since I've already got a pretty good desktop with a 24" monitor. So the kind of laptop I want to buy is one suited for university. If this bears any significance, I'm working in Java at the moment, but I will likely work with lots of other things, incl. web development. I'm looking to spend about $1700, plus/minus. It should be powerful and big enough for working on programming projects as well as the usual university stuff like MATLAB, Maple, etc. out "in the field", and sometimes for maybe a week when visiting my parents. What I'm looking at right now is the ASUS Zenbook UX31A with the 1920 x 1080 resolution on a 13.3" IPS display, but I'm kinda nervous that this will be too petite for programming. In essence I'm looking for a powerful computer that has good enough battery life and looks good. I would love suggestions or any type of feedback, either a better choice or input on what it's like programming on 13" laptops. Very much thanks in advance to anyone who even went through all that! PS. I don't want a Mac, or my inner karma would commit seppuku xD But experiences from working on the 13" MacBook Air would be kinda equivalent to the Zenbook I'm considering, so I would love to hear them. tl;dr The quick brown fox jumps over the lazy dog ;)

    Read the article

  • I need some career/education advice regarding computer science [on hold]

    - by user2521987
    So I'm a senior mathematics major this fall, and I have only taken three CS classes (Java I, Java II, and C++). This summer I am participating in a mathematics REU (Research Experience for Undergraduates), and I program in C++ about 8 hours a day... and I find that I absolutely love it. I love using programming to solve math problems in my research. I think I want to pursue a career in programming. I have a few options:
    1. Stay at my university an extra 1-1.5 years (beyond the 4) and do a double major in Math/CS. This will put me up to around 7-10k in debt (currently I have no debt and am scheduled to graduate debt-free). Then apply to a master's in CS.
    2. Apply directly to a master's in CS from a math undergraduate degree. I don't like this idea because I likely won't get into a good program or get funded with such little background.
    3. Go to graduate school, funded, in applied mathematics and try to further my knowledge of computer science while there. Then apply to a master's in CS.
    I'm not sure if 1 or 3 would be better. My end goal would be to go to a top 20-30 CS graduate program and to get a cool, good job. What would you recommend?

    Read the article

  • About the K computer

    - by nospam(at)example.com (Joerg Moellenkamp)
    Okay, after getting yet another mail because of the new #1 on the Top500 list, I want to add some comments from my side:
    Yes, the system is using SPARC processors. And that is great news for a SPARC fan like me. It is using the SPARC VIIIfx processor from Fujitsu, clocked at 2 GHz.
    No, it isn't the only one. Most people say there are two systems in the Top500 list using SPARC (#77 JAXA and #1 K), but in fact there are three. The Tianhe-1 (#2 on the Top500 list) supercomputer contains 2048 Galaxy "FT-1000" 1 GHz 8-core processors. Don't know it? The FeiTeng-1000: this proc is an 8-core, 8-threads-per-core, 1 GHz processor made in China. And it's SPARC based. By the way, this sounds really familiar to me; perhaps the people just took the open-sourced UltraSPARC T2 design, because some of the parameters sound just too similar. However, it looks like Tianhe-1 is using the SPARCs as input nodes and not as compute nodes.
    No, I don't see it as the next M-series processor. Simple reason: you can't create SMP systems out of them; it simply hasn't the functionality to do so. Even when there are multiple CPUs on a single board, they are not connected into a shared-memory SMP/NUMA machine; they are connected with the cluster interconnect (in this case the Tofu interconnect) and work like a large cluster.
    Yes, it has a lot of oomph in Linpack; however, I assume a lot of it came from the extensions to the SPARCv9 standard.
    No, Linpack has no relevance for any commercial workload; Linpack is such a special load that even some HPC people argue it isn't really a good benchmark for HPC. It's embarrassingly parallel, and it can work with relatively small interconnects compared to the interconnects in SMP systems (however, we are getting into the spheres where SMP interconnects were a few years ago). Amdahl isn't hitting that hard when running Linpack.
    Yes, it's a good move to use SPARC. At some time in the last 10 years there was an interesting twist in perception: SPARC was considered the proprietary architecture and x86 the open architecture. However, it's vice versa: try to create an x86 clone and you have a lot of intellectual property problems; create a SPARC clone and you have to spend 100 bucks or so to get the specification from the SPARC Foundation and develop your own SPARC processor. Fujitsu has been doing this for a long time now, so they had their own processor, their own know-how. So why was SPARC a good choice? Well, essentially Fujitsu can do what they want with their core, as it is their core - adding the extensions to the SPARCv9 architecture, for example; getting Intel to create extensions to x86 to help you with your product is a little bit harder. So Fujitsu could do what they needed to do with their processor in order to create such a supercomputer.
    No, the K is really using no FPGAs or GPUs as accelerators. The K is really using the CPU for doing this job.
    Yes, it has a significantly enhanced FPU capable of executing 8 instructions in parallel.
    No, it doesn't run Solaris. Yes, it uses Linux.
    No, it doesn't hurt me... as my colleague Roland Rambau (he knows a lot about HPC) once said to me, it doesn't matter which OS it is, as long as it stays out of the way of the workload in HPC.

    Read the article

  • Graduating soon with a computer science degree, but have unique circumstances [closed]

    - by Donnie
    I joined the Navy in 1998 and was admitted into Nuclear Power Training. I got my electrician's mate certificate, but was put on medical hold while I was in Nuclear Power Training. I was sent to the Naval Hospital and received a medical (honorable) discharge in the middle of 2000. I decided to stay at home and raise my son, and my girlfriend worked. A few years ago, I decided that I wanted to work as a programmer, so I went to college, and I will soon be graduating with a degree in computer science. I hope to finish with a relatively high GPA, 3.8 or 3.9. My question is this: how much, if any, of my Navy experience should I put on my resume? And how do I explain my nine-year gap as a stay-at-home dad? Do I even try to explain it? I know recent college graduates typically have no experience, but obviously I'm not the typical college graduate. Will my long absence from working, or my relatively short time in the Navy, hurt my chances? Should I just put the college on my resume and hope that HR thinks I'm younger than I am? Obviously, then, my age would show at the interview and there would be questions. Any help is appreciated.

    Read the article

  • The Most Ridiculous Computer Cameos of All Time

    - by Jason Fitzpatrick
    For the last half century computers have played all sorts of major and minor roles in movies; check out this collection to see some of the more quirky and out-of-place appearances. Wired magazine rounds up some of the more oddball appearances of computers in film. Like, for example, the scene shown above from Soylent Green: "Spoiler alert: Soylent Green is people! But that's not the only thing we're gonna spoil. Soylent Green is set in 2022, and at one point, you'll notice that a government facility is still using a remote calculator that plugs into the CDC 6600, a machine that was state-of-the-art in 1971. Come to think of it, we should scratch this from the list. This is pretty close to completely accurate." Hit up the link below to check out the full gallery, including a really interesting bit about how the U.S. Government's largest computer project, once decommissioned and sold as surplus, ended up on the sets of dozens of movies and television shows. The Most Wonderfully Ridiculous Movie Computers of All Time [Wired]

    Read the article

  • Why is CS never a topic of conversation of the layman? [closed]

    - by hydroparadise
    Granted, every profession has its technicalities. If you are an MD, you had better know the anatomy of the human body, and if you are an astronomer, you had better know your calculus. Yet you don't have to know these more advanced topics to know that smoking might give you lung cancer because of carcinogens, or that the moon revolves around the earth because of gravity (thank you, Discovery Channel). There's a sort of common knowledge (at least in more developed countries) of these more advanced topics. With that said, why are things like recursive descent parsing, BNF, or Turing machines hardly ever mentioned outside 3000- or 4000-level classes in a university setting, or between colleagues? Even back in my days before college, in my pursuit of knowledge of how computers work, these very important topics (IMHO) never seemed to see the light of day. Many different sources and sites go into "What is a processor?" or "What is RAM?" or "What is an OS?". You might get lucky and discover something about programming languages and how they play a role in how applications are created, but nothing about the tools for creating the language itself. To extend this idea: Dennis Ritchie died shortly after Steve Jobs, yet Dennis Ritchie got very little press compared to Steve Jobs. So, the heart of my question: does the public in general not care to hear about the computer science topics that make the technology in their lives work, or does the computer science community not lend itself to the general public enough to close the knowledge gap? Am I wrong to think the general public has the same thirst for knowledge of how things work as I do? Please consider the question carefully before answering or voting to close.

    Read the article

  • Why doesn't my computer resume after sleeping overnight?

    - by bamdad
    I'm having a weird, weird bug that's been haunting me since 11.10. If I listen to music or watch a video and my computer automatically goes to sleep at night, it won't properly resume in the morning. Otherwise, suspend and resume works just fine. What happens is that the Wi-Fi and Bluetooth indicator (which turns from white to orange when suspending) stays orange, the display doesn't turn on, and the only option I have is to hard-reset the machine. Here's what I've tried so far:
    - installing (and uninstalling and reinstalling) laptop-mode-tools
    - switching the proprietary wireless driver (broadcom-wl) to the open source one (brcmsmac & bcma) and back
    - unloading (and blacklisting) all Bluetooth modules (rfcomm, btusb, bnep, bluetooth) and stopping (# stop bluetooth) and disabling (# echo 'manual' > /etc/init/bluetooth.override) the bluetooth service
    - creating a custom pm sleep action as suggested here: http://ubuntuforums.org/showthread.php?p=11926504
    - not watching YouTube / any stuff that uses Flash before going to sleep (I have Flashblock, and I checked $ ps aux | grep flash) because I suspected Flash to be the culprit
    - trying out different versions of fglrx (the one from the repos, then installing the latest one from AMD's site via generated .deb files, then back to the official ones)
    None of these worked. I remember back in the days of 10.04 there was a gconf key called network sleep: I thought about disabling that, since re-enabling the wireless card seems to be the problem (according to the indicator LED), but the option appears to be missing from GNOME 3 (unity-2d, whatever). Does anyone have any ideas? Thanks, bamdad

    Read the article

  • Mobile development, recommended computer configuration?

    - by MikaelW
    Hi, For the last 4 weeks, I have been trying to get into mobile development. Done a couple of tutorials, read some books, developed a couple of dummy Android apps. The thing is my computer is a 5 years old laptop, it is slow and time has come to replace it and I’m looking at different offers online. Have you got any recommendations? Is there any must-have that should make my developer life easier in the future? Is there anything specific that may be useful at a more advanced stage of development that I just can’t think of right now on the hardware side? (I mean apart from good proc, lots of RAM, many USB ports...) One thing I can think of is to have three OS on the same workstation: Windows, Unix and MacOS (so far I focused on android/java/eclipse but am interested in Iphone/objC/xcode as well) but that’s more on the software side. Anyway, would be grateful for any recommendations. Thanks in advance! Mikael PS: I’m quite free on the budget side of things PPS: I'm aware it's not really a programming question but will still be of interest to some programmers here.

    Read the article

  • Majoring in computer science, but I'm not too sure I'm in the right field [closed]

    - by user74340
    Throughout my high school years and first year in college, I never thought of studying computer science. I studied biology and chemistry during my first year, and I didn't like the research, nor any type of medical profession. So I took an introductory CS course, and loved the diverse roles this field can have. So I declared CS as my major. I finished the first- and second-year CS courses, and now I'm doing my co-op (internship) as a web developer. During my first and second year, I was always just an average student. My grades are around a low B, but I put so much effort into understanding my course materials. I see many brilliant peers who not only excel at what they do, but have the passion. So I always doubt whether I belong in this field. I'm not good at math; I usually get Cs in my math courses. My internship (a corporate developer job) is okay, but I don't want to work like this after my graduation. One aspect of CS that I like is HCI. In my experience with programming and group projects, I enjoyed designing user interfaces and thinking about user experience. I'm also thinking of taking some psychology courses. I would appreciate any criticism or advice.

    Read the article

  • My computer is broken after recent update attempt to 14.04

    - by user317550
    So it all started on a day much like today (because it is today, but that's not the point), when I got a notification telling me I hadn't upgraded to 14.04. Not due to lack of trying, however. It offered to upgrade me itself. Now keep in mind, I've tried very hard to upgrade my OS from 12.04 to 14.04, many times. I believe that, due to tinkering where there shouldn't be tinkering, my BIOS is messed up. So upgrading is essentially impossible, but I wasn't about to stop it from updating for me, thinking it didn't have too much to do with the BIOS as it doesn't reboot until after. So I let it go about its business, and some time later I looked back at it, and my Unity sidebar was gone, and any time there's text on screen it shows as those box things. The real bottom line is that I want to know my options. All of them. I would love to be able to keep the stuff on my hard drive, so a hard drive swap may be an option if you guys say that would work. I just need my computer back. Let me know if I left anything out. Peace! B^)

    Read the article

  • Master's in Software Engineering vs. Master's in Computer Science: which degree is preferred by employers?

    - by dbarker
    I've been building software professionally for 7 years and am considering a master's degree. I understand the difference between these two degrees simply as: the MSCS is the theory while the MSE is the practice. I'm equally interested in both and would be happy with either, although I'm curious how these degrees rank in the eyes of a potential employer. I could see two views that a hiring manager might take:
    - an MSCS is loftier and carries an implied knowledge of Software Engineering
    - an MSE is more practical and carries an implied knowledge of Computer Science
    In my own experience I've seen MSCS degree holders who cannot program at all, while others are among the best programmers I've met, so of course actual ability will depend on the individual. My question is about the "on paper" value of these two degrees when seeking a job. All things considered, is one degree more hirable or higher-paying than the other?

    Read the article

  • How can I learn the math necessary for working with computer vision?

    - by Duncan Benoit
    I know that computer vision involves a lot of math, but I need some tips on how programmers gain that knowledge. I've started to use the OpenCV library, but I have some major problems in understanding how the math works in the algorithms. In college I studied some math and we worked with matrices and derivatives, but I didn't pay too much attention to the subject. It seemed so difficult and useless from a programmer's point of view. I suppose there has to be some easy way to understand what a second derivative is without calculating an equation. (Derivatives are just an example.) Do you have any tips for me on how I can gain such knowledge? A forum, book, link, advice, anything?
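    As a worked example of the kind of intuition being asked about (assuming, since OpenCV is mentioned, that the thing being differentiated is image intensity along a row of pixels): on discrete data a derivative is just a difference between neighbouring samples, so the second derivative reduces to

        f''(x) ≈ f(x+1) - 2*f(x) + f(x-1)

    which is the same as convolving the row with the kernel [1, -2, 1]. Summing the second derivatives in x and y gives the Laplacian that OpenCV's Laplacian() function computes, and its large values mark rapid intensity changes such as edges - no symbolic calculus required.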

    Read the article

  • What is the best method of assessment for computer science students?

    - by Gavimoss
    This question is a bit more philosophical, so feel free to remove it if you like, but it's been bugging me for the last 4 years! As a final-year student I find that exams can often be passed with a couple of days of cramming, without necessarily retaining or understanding the content, i.e. a regurgitation of lecture notes is often enough to gain high marks. A friend of mine is about to graduate with an honours degree whose final-year evaluation was based solely on practical work (a project, assignment marks and the creation of a poster), yet all of this work could have been completed by a third party. Personally I don't think either of these methods of assessment is sufficient: I am currently on track for a 1st class honours in artificial intelligence and computer science, and I believe this is mostly due to my skill in passing exams, not my skill as a programmer or any vast in-depth knowledge of the subjects I have "studied". Surely there is a better way to assess our skills - isn't there?

    Read the article

  • Vista install works on one computer, but bluescreens another (on which Vista is known to work)

    - by Ken
    I hope my explanations make some sense -- please ask for clarification if they don't. I had a computer running Windows Vista (Ultimate, 64-bit). All was well! Then one day there was a nasty power surge at the office, and it died. (We didn't have surge protectors at the office, unfortunately. I assumed our lines were conditioned elsewhere, or was not an issue here. Oops.) After some testing, it was determined that the PSU, motherboard, and RAM were bad. While waiting for new hardware to arrive, I put my hard disk in a spare PC which had identical parts (mobo/CPU/RAM/PSU/video). Everything worked perfectly. The only way I could even tell it wasn't my computer is because Vista asked to re-activate itself with the new hardware, which worked fine, too. So the hard disk seems OK. Then the new parts arrived. The old motherboard model is no longer manufactured, so it's a new one with the same CPU/RAM/videocard/etc. slots. The PSU is also new, while the RAM I'm using is from the spare PC mentioned above. When I put it together and tried booting with my old hard disk, it starts to boot Windows, and then (fairly early in the process) gives a bluescreen and immediately reboots (so I can't see whatever the bluescreen is trying to tell me). I tried "safe mode", which also bluescreened. I tried booting the Vista DVD and running the repair utility, which found a Vista install, confirmed that it would not boot, and, eventually, declared that it was unable to repair it. I installed Vista fresh on a new hard disk, with the new mobo/etc., and it works perfectly. (That's what I'm running now.) I've also booted a Linux CD here, which ran great, and I've run Memtest86+ for a while, which found no errors. So all the hardware apart from the old hard disk seems OK, too. I don't think the problem is with my old Vista hard disk, since I used that with another mobo/CPU just fine. I don't think it's any other part of the new hardware, since I'm able to use it (and test it) with no trouble. It's just the combination of my old Vista install plus the new PC hardware that's not happy. I can get my data off my old hard disk and onto my new hard disk, and reinstall my apps, but it would be nice if I could fix things so I could continue to use my old hard disk as before. The latest hypothesis I've heard is that Vista had trouble with the new hardware (i.e., motherboard), but we have no idea what to do about that (except Safe Mode, which didn't work). Suggestions? Hypotheses for what's not right about this combination of Vista install and motherboard? Thanks!

    Read the article
