Search Results

Search found 2009 results on 81 pages for 'science fiction'.

Page 6 of 81

  • How Stuff Works: Fiber Optic Cables [Science]

    - by Jason Fitzpatrick
    Most people are familiar with the general concept of fiber optic cables–light as a method of data transmission–but how do they really work? Find out in this informative video. Bill Hammack, of Engineer Guy Videos, shows us how fiber optic cables work using–of all things–a bucket and a laser. Check out the video for a glimpse inside the cables and how your analog voice travels from your phone’s handset to a digital stream and back to analog sound for the benefit of your friend at the other end of the fiber optic transmission. Fiber Optic Cables: How They Work and How Engineers Use Them to Send Messages [YouTube]
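
    The digitization step the video describes is easy to sketch in code. Below is a minimal illustration (ours, not the video's) of sampling and quantizing a voice-like signal into a digital stream; the 8 kHz rate and 8-bit samples are the classic telephony parameters, assumed here for concreteness (real phone systems also apply companding, omitted here).

    ```python
    # Minimal sketch of digitizing a voice-like signal, as in telephony.
    # Assumed parameters: 8 kHz sampling, 8-bit samples (256 levels).
    import math

    SAMPLE_RATE = 8000   # samples per second (standard telephone rate)
    BIT_DEPTH = 8        # bits per sample

    def digitize(analog_signal, duration_s):
        """Sample a continuous signal (a function of time) into 8-bit integers."""
        levels = 2 ** BIT_DEPTH
        samples = []
        for n in range(int(SAMPLE_RATE * duration_s)):
            t = n / SAMPLE_RATE
            value = analog_signal(t)                         # in [-1.0, 1.0]
            code = int((value + 1.0) / 2.0 * (levels - 1))   # map to 0..255
            samples.append(code)
        return samples

    # A 440 Hz tone standing in for a voice; the receiving end reverses the
    # mapping to reconstruct an approximation of the original waveform.
    stream = digitize(lambda t: math.sin(2 * math.pi * 440 * t), 0.01)
    print(stream[:10])
    ```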

    Read the article

  • How Fiber Optic Cables Are Made and Laid Across the Sea [Science]

    - by Jason Fitzpatrick
    We don’t know about you, but yesterday’s video about how fiber optic cables work just made us more curious. Check out how the cables are made and laid across the sea. In the first video we see how fiber optic strands are manufactured, including how the draw tower mentioned in yesterday’s video works. Once the strands are manufactured, where do they go and how are they used? In the second video we see Alcatel-Lucent’s Ile de Sein, one of the largest and most powerful cable-laying ships in the world, complete with cable storage wells that look like small stadiums. Finding out how the cables are made, and what kind of planning and machinery it takes to lay them across the ocean, is just as interesting as how they work. How It’s Made: Fiber Optics [YouTube] Undersea Cable [YouTube]

    Read the article

  • Five Fake Sounds Engineered to Make You Feel Better [Science]

    - by Jason Fitzpatrick
    As objects in our environment (like cars, ATMs, and phones) have grown lighter and quieter, scientists have been carefully engineering their sounds so that they continue to sound the way we expect them to. Read on to see how. At the design blog Humans Invent, they share five interesting ways the world around us is being engineered to sound the way we expect it to. They start with the example of the car door. Years ago cars were almost entirely steel, the doors were weighty, and when you slammed them it sounded like one big hunk of steel locking into another big hunk of steel (which, in fact, it was). Newer cars are lighter, but people still crave that substantial clunk. Humans Invent highlights the effect of consumer desire: A car door is essentially a hollow shell with parts placed inside it. Without careful design the door frame amplifies the rattling of mechanisms inside. Car companies know that if buyers don’t get a satisfying thud when they close the door, it dents their confidence in the entire vehicle. To produce the ideal clunk, car doors are designed to minimise the amount of high frequencies produced (we associate them with fragility and weakness) and emphasise low, bass-heavy frequencies that suggest solidity. The effect is achieved in a range of different ways – car companies have piled up hundreds of patents on the subject – but usually involves some form of dampener fitted in the door cavity. Locking mechanisms are also tailored to produce the right sort of click, and the way seals make contact is precisely controlled. On average it takes 1.8 seconds to close a car door, but in that time you’re witnessing a strange kind of symphony composed by engineers and designers whose goal is to reassure you that it’s rock solid. They also mention lock mechanisms, something you may never have thought about. A friend of mine had a Ford Focus some years ago, and that particular model had electric locks that, instead of giving a satisfying thunk or solid click, made a horrible gates-of-the-prison buzzing sound that was completely unnerving. Hit up the link below to see how sounds are engineered for car doors, electric motors, ATMs, and more. 5 Fake Sounds Designed to Help Humans [Humans Invent via Boing Boing]
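
    In signal-processing terms, the trick Humans Invent describes (damping high frequencies while letting the bass through) is a low-pass filter. As a rough sketch of the principle only (nothing below comes from the article), a first-order filter looks like this:

    ```python
    # Sketch of the principle behind the "solid clunk": attenuate high
    # frequencies, keep the bass. A first-order low-pass filter for
    # illustration; alpha closer to 0 cuts highs more aggressively.
    def low_pass(samples, alpha=0.1):
        """Exponential smoothing: each output leans on the previous one,
        so rapid (high-frequency) changes are suppressed."""
        out = []
        prev = 0.0
        for x in samples:
            prev = prev + alpha * (x - prev)
            out.append(prev)
        return out

    # A sharp metallic "clack" (fast oscillation) comes out strongly
    # attenuated, i.e. closer to a dull thud.
    clack = [1, -1, 1, -1, 1, -1, 0, 0, 0, 0]
    print([round(v, 2) for v in low_pass(clack)])
    ```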

    Read the article

  • The Science Behind Technological Moral Panics

    - by Jason Fitzpatrick
    Why do some new technologies cause ripples and reactionary backlash in society while others slip into our daily lives almost entirely uncontested? It turns out there’s a rather specific combination of things a new technology must do to upset the public. At Wired, they highlight the work of Genevieve Bell and her studies of how society reacts to new technology: Genevieve Bell believes she’s cracked this puzzle. Bell, director of interaction and experience research at Intel, has long studied how everyday people incorporate new tech into their lives. In a 2011 interview with The Wall Street Journal‘s Tech Europe blog, she outlined an interesting argument: to provoke moral panic, a technology must satisfy three rules. First, it has to change our relationship to time. Then it has to change our relationship to space. And, crucially, it has to change our relationship to one another. Individually, each of these transformations can be unsettling, but if you hit all three? Panic! Why We Freak Out About Some Technologies but Not Others [Wired]

    Read the article

  • Mobile or the Science of Programming Languages

    - by user12652314
    Just two things to share today. First, some news in the mobile computing space: a pretty cool new relationship developing between DubLabs and AT&T to enable a student-centric mobile experience for our Campus Solutions customers. Second, an interesting article a friend shared on research in programming languages as it relates to STEM education, a key story element in my project with the America's Cup and iED, and a matter of national interest.

    Read the article

  • Distance Education in Computer Science - HCI

    - by Rionmonster
    I have been a software engineer / graphic designer for a few years and have recently been considering furthering my education in the field. (It was actually a very generous Christmas present.) I would primarily be interested in something like Human-Computer Interaction or a similar "creative technology" field that involves heavy UI/UX design, prototyping, or information architecture. Anyway, I still plan on working full-time, so I have been looking into part-time distance programs. Has anyone pursued a similar degree (either at a distance or in person) who could share their experiences? Thanks!

    Read the article

  • The Science of Creating a Money-Making Keyword List

    Creating a money-making keyword list is the foundation of your website marketing strategy. If your keyword list contains the right words or phrases, you will see plenty of traffic finding your site. If the keywords are not specific enough, or are too general, your traffic will suffer, and those who do happen to find your site may look around for a second or two and then move on.

    Read the article

  • Can a communications & computer engineer work as a programmer? [closed]

    - by Egy Prot
    I'm currently studying communications & computer engineering. My professor told me that engineers have a much higher hiring priority than computer scientists, that it's good I have the ability to apply to the faculty of engineering, and that this will help me achieve my ambition of becoming a programmer, since we'll study computer science anyway. Yet while browsing job listings, all I ever see is "computer science, computer science..." everywhere, and courses vary from one faculty to another. Could he be right? If so, do employers prefer computer scientists or engineers?

    Read the article

  • A Good Final High School AP Computer Science Programming Project?

    - by user297663
    Hey guys, this question might seem very specific, but I need some ideas for a project to do in my last month or so of my AP Computer Science class. I've been looking at some college final-project ideas, and a lot of them just seem plain boring. At first I thought about writing an IRC client in Java, but I wouldn't really be learning anything "new" that would help me in the future. Then I thought about doing iPhone/iPod Touch apps (I don't have an Android phone, and I can easily get my hands on an iPod Touch), but I would need ideas for apps. I want to do something that isn't trivial and needs some explanation, but that will also help me in the long run by teaching me new concepts in computer science. If you guys could help out I would greatly appreciate it. I really only have a month for this project, so try to keep suggestions inside that range. Also, I don't mind learning new languages. Thanks :)

    Read the article

  • Should I go to school and get my degree in computer science?

    - by ryan
    I'll try and keep this short and simple. I've always enjoyed programming, and I've been doing it since high school. Right after I graduated from high school (2002), I opted to skip college because I was offered a software engineer position. I quit a couple of years later to team up on various startup companies. Most of them did not launch as well as expected, but honestly that didn't matter to me, because I learned so much from the experience. Fast forward to today: now 25, I need a job in this tough economic climate. Looking on Craigslist, a lot of the listings require computer science degrees. It's evident that programming is what I want to do, because I never seem to get enough of it. But the thought of pushing through two years for an associate's degree at age 25, never having attended a real computer class, is very, very discouraging. And the thought of having to start from the basics (Hello WOOOOORRLLLD) just does not seem exciting. I guess I have three questions to wrap this up: Should I just suck it up and go back to school while working at McDonald's at age 25? Is there a way I can skip the boring stuff and just get tested on what I know? In your experience, how many jobs use computer science degrees as prerequisites? Or am I screwed, and better pray that my next startup will be the next big thing?

    Read the article

  • If you could take one computer science course now, what would it be?

    - by HenryR
    If you had the opportunity to take one computer science course now, and as a result significantly increase your knowledge in a subject area, what would it be? Undergraduate or graduate level. Compilers? Distributed algorithms? Concurrency theory? Advanced operating systems? Let me know why. (Note that I appreciate this isn't a far-fetched scenario; it's just that time and inertia might be preventing people from taking the course or reading the book.)

    Read the article

  • What areas of computer science are particularly relevant to mobile development?

    - by MalcomTucker
    This isn't a platform-specific question; rather, I'm interested in the general, platform-independent areas of computer science that are particularly relevant to mobile application development. For example, things like compression techniques, distributed synchronisation algorithms, etc. What theoretical concepts have you found relevant, useful, or enabling when building mobile apps?

    Read the article

  • Was a Woman the World's First Computer Programmer? [closed]

    - by Sveta Bondarenko
    This week, on 10th December, we celebrate the 197th anniversary of the birth of Ada Lovelace, often considered the world's first computer programmer. Ada became famous not only as the daughter of the romantic poet Lord Byron but also as an outstanding 19th-century mathematician. Her notes on the Analytical Engine are recognized as containing the first algorithm intended to be processed by a machine. Women have always played a crucial role in the evolution of computer science, but unfortunately they are often considered not as good at programming and engineering as men. Even though the fair sex makes up a growing portion of computer and Internet users, there is still a large gender gap in the field of computer science. But all is not lost! According to one study, women's enrollment in computer science rose from 7 percent in 1995 to 42 percent in 2000, and it is still increasing. Soon women will take a well-deserved position among the world's top computer programmers. After all, a number of notable female computer pioneers, such as Ada Lovelace, Grace Hopper, and Anita Borg, have proven that women make great computer scientists. But will women make great contributions to the modern technology industry? Or is a successful and famous female computer programmer just a pipe dream?

    Read the article

  • Fact or fiction: WebKit’s rendering performance is much lower when used in Qt

    - by Jen
    Does Qt’s WebKit implementation render much more slowly than a direct implementation of the WebKit engine? Is this true or just a myth? From my own experience, a complex page takes about twice as long to load in Qt’s “Fancy Browser” example as it does in Google Chrome (which also incorporates a port of WebKit), but I hardly think that is a fair comparison. Any insights on this?
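
    One way to make the comparison a little fairer is to time page loads programmatically instead of by eye. Here is a minimal sketch, assuming PyQt4 with its QtWebKit module installed; note this measures wall-clock load time only, which is a crude proxy for rendering performance:

    ```python
    # Rough load-time measurement for Qt's WebKit port (PyQt4 assumed).
    # Measures wall-clock time from load() to the loadFinished signal.
    import sys, time
    from PyQt4.QtCore import QUrl
    from PyQt4.QtGui import QApplication
    from PyQt4.QtWebKit import QWebView

    app = QApplication(sys.argv)
    view = QWebView()
    start = time.time()

    def on_finished(ok):
        print("loaded ok=%s in %.2f s" % (ok, time.time() - start))
        app.quit()

    view.loadFinished.connect(on_finished)
    view.load(QUrl("http://www.example.com/"))
    view.show()
    app.exec_()
    ```

    Running the same measurement against the same page several times, with caches cleared, would give a more defensible number than a one-off visual comparison with Chrome.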

    Read the article

  • Master's in Software Engineering vs. Master's in Computer Science: which degree is preferred by employers?

    - by dbarker
    I've been building software professionally for 7 years and am considering a master's degree. I understand the difference between these two degrees simply as: the MSCS is the theory, while the MSE is the practice. I'm equally interested in both and would be happy with either, but I'm curious how these degrees rank in the eyes of a potential employer. I can see two views a hiring manager might take: an MSCS is loftier and implies knowledge of software engineering; an MSE is more practical and implies knowledge of computer science. In my own experience I've seen MSCS degree holders who cannot program at all, while others are among the best programmers I've met, so of course actual ability depends on the individual. My question is about the "on paper" value of these two degrees when seeking a job. All things considered, is one degree more hirable, or higher-paying, than the other?

    Read the article

  • Cobol: science and fiction

    - by user847
    There are a few threads about the relevance of the Cobol programming language on this forum, e.g. this thread links to a collection of them. What I am interested in here is a frequently repeated claim based on a study by Gartner from 1997: that there were around 200 billion lines of Cobol code in active use at that time! I would like to ask some questions to verify or falsify a couple of related points. My goal is to understand whether this statement has any truth to it or whether it is totally unrealistic. I apologize in advance for being a little verbose in presenting my line of thought and my own opinion on the things I am not sure about, but I think it might help to put things in context and thus highlight any wrong assumptions and conclusions I have made. Sometimes the "200 billion lines" figure is accompanied by the added claim that this corresponded to 80% of all programming code in any language in active use. Other times the 80% merely refers to so-called "business code" (or some other vague phrase hinting that the reader is not to count mainstream software, embedded systems, or anything else where Cobol is practically non-existent). In the following I assume that the count does not include double-counting of multiple installations of the same software (since that is cheating!). In the run-up to the y2k problem, it was noted that a lot of Cobol code was already 20 to 30 years old. That would mean it was written in the late '60s and '70s. At that time, the market leader was IBM with the IBM/370 mainframe. IBM has put up a historical announcement on its website quoting prices and availability. According to that sheet, prices were about one million dollars for machines with up to half a megabyte of memory.

    Question 1: How many mainframes were actually sold? I have not found any numbers for that era; the only numbers I have found are for the year 2000, again by Gartner. :^( I would guess that the actual annual number was in the hundreds or the low thousands; if the market size was 50 billion in 2000 and the market has grown exponentially like any other technology, it might have been merely a few billion back in 1970. Since the IBM/370 was sold for twenty years, twenty times a few thousand machines a year yields a couple of tens of thousands of machines (and that is pretty optimistic)!

    Question 2: How large were the programs in lines of code? I don't know how many bytes of machine code result from one line of source code on that architecture. But since the IBM/370 was a 32-bit machine, any address access must have used 4 bytes plus the instruction (2, maybe 3 bytes for that?). If you count in the operating system and the program's data, how many lines of code would have fit into a main memory of half a megabyte?

    Question 3: Was there no standard software? Did every single machine sold run a unique hand-coded system without any standard software? Seriously, even if every machine was programmed from scratch without any reuse of legacy code (wait ... wouldn't that contradict the decades-old-code claim we started from?), we might have O(50,000 l.o.c./machine) * O(20,000 machines) = O(1,000,000,000 l.o.c.). That is still far, far, far away from 200 billion! Am I missing something obvious here?

    Question 4: How many programmers would it take to write 200 billion lines of code? I am really not sure about this one, but if we take an average of 10 l.o.c. per day, we would need 55 million man-years to achieve this! In a time-frame of 20 to 30 years, that would mean two to three million programmers constantly writing, testing, debugging, and documenting code. That would be about as many programmers as China has today, wouldn't it?

    Question 5: What about the competition? So far, I have come up with two things here: 1) IBM had its own programming language, PL/I. Above I have assumed that the majority of code was written exclusively in Cobol, but all other things being equal, I wonder whether IBM marketing would really have let Cobol push its own language off the market on its own machines. Was there really no relevant code base of PL/I? 2) Sometimes (also on this board, in the thread quoted above) I come across the claim that the "200 billion lines of code" are simply invisible to anybody outside of "governments, banks ..." (and whatnot). Actually, the DoD funded its own language in order to increase cost effectiveness and reduce the proliferation of programming languages, which led to its use of Ada. Would they really have worried about having so many different programming languages if they had predominantly used Cobol? If there was any language running on "government and military" systems outside the perception of mainstream computing, wouldn't that language be Ada?

    I hope someone can point out any flaws in my assumptions and/or conclusions and shed some light on whether the above claim has any truth to it.
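
    As a sanity check, the post's back-of-envelope arithmetic reproduces in a few lines of Python; every input below is the poster's own rough guess, not an established figure:

    ```python
    # Reproducing the post's Fermi estimate. All inputs are the poster's
    # guesses, not established figures.
    machines = 20000           # optimistic count of IBM/370-era mainframes
    loc_per_machine = 50000    # lines that plausibly fit half a megabyte
    total_loc = machines * loc_per_machine
    print("estimated code base: %.0e lines" % total_loc)   # ~1e9, not 2e11

    claimed_loc = 200e9        # the Gartner-attributed claim
    loc_per_day = 10           # assumed average programmer output
    man_years = claimed_loc / loc_per_day / 365
    print("man-years to write the claim: %.0f million" % (man_years / 1e6))  # ~55
    ```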

    Read the article

  • What is the best method of assessment for computer science students?

    - by Gavimoss
    This question is a bit more philosophical, so feel free to remove it if you like, but it's been bugging me for the last 4 years! As a final-year student, I find that exams can often be passed with a couple of days of cramming, without necessarily retaining or understanding the content; a regurgitation of lecture notes is often enough to gain high marks. A friend of mine is about to graduate with an honours degree whose final-year evaluation was based solely on practical work (a project, assignment marks, and the creation of a poster), yet all of this work could have been completed by a third party. Personally, I don't think either method of assessment is sufficient: I am currently on track for first-class honours in artificial intelligence and computer science, and I believe this is mostly due to my skill at passing exams, not my skill as a programmer or any vast, in-depth knowledge of the subjects I have "studied". Surely there is a better way to assess our skills, isn't there?

    Read the article

  • What computer science topic am I trying to describe?

    - by ItzWarty
    I've been programming for around 6-8 years, and I've begun to realize that I don't really know what happens at the low level when I do something like int i = j % 348. The thing is, I know what j % 348 does: it divides j by 348 and finds the remainder. What I don't know is HOW the computer does this. Similarly, I know that try { blah(); } catch (Exception e) { blah2(); } will invoke blah, and if blah throws, it will invoke blah2; however, I have no idea how the computer does this instead of, err... crashing or ending execution. I figure that in order to get "better" at programming, I should probably know what my code is really doing. [This would probably also help me optimize and, err... not do stupid things.] What I'm asking for is probably something huge taught in universities, but to be honest, if I could learn even a little, I would be happy. The point of the question is: what topic or computer science course am I asking about? Since I don't know what the topic is called, I'm unable to find a book or online resource to learn from, so I'm sort of stuck. I'd be eternally thankful if someone helped me =/
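
    On the modulo example specifically (the area is usually taught as computer architecture, with the exception-handling half belonging to compilers and runtime systems): most CPUs have no standalone "remainder" circuit; the compiler typically lowers % to a divide, a multiply, and a subtract. A sketch of that identity, using truncating division to mirror C's integer semantics:

    ```python
    # How j % 348 is typically computed under the hood: the divide
    # instruction yields a quotient, and the remainder falls out as
    # j - quotient * 348. (Compilers often replace the divide with a
    # multiply by a precomputed "magic" constant, but the identity holds.)
    def remainder(j, d=348):
        quotient = int(j / d)   # truncates toward zero, like C's integer /
        return j - quotient * d

    for j in (0, 347, 348, 1000, 123456):
        assert remainder(j) == j % 348   # matches % for non-negative j
    print(remainder(123456))             # 264
    ```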

    Read the article

  • What are graphs, in layman's terms?

    - by Justin984
    What are graphs in computer science, and what are they used for? In layman's terms, preferably. I have read the definition on Wikipedia: In computer science, a graph is an abstract data type that is meant to implement the graph and hypergraph concepts from mathematics. A graph data structure consists of a finite (and possibly mutable) set of ordered pairs, called edges or arcs, of certain entities called nodes or vertices. As in mathematics, an edge (x,y) is said to point or go from x to y. The nodes may be part of the graph structure, or may be external entities represented by integer indices or references. But I'm looking for a less formal, easier-to-understand definition.
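
    For a concrete picture of what that definition boils down to: in code, a graph is often just a mapping from each node to the nodes it connects to. A toy sketch (the cities and roads are made up) showing the classic use, checking whether one node can reach another:

    ```python
    # A graph in its most common concrete form: an adjacency list.
    # Nodes are cities; each edge is a road between two of them.
    from collections import deque

    roads = {
        "Springfield": ["Shelbyville", "Capital City"],
        "Shelbyville": ["Springfield"],
        "Capital City": ["Springfield", "Ogdenville"],
        "Ogdenville": ["Capital City"],
    }

    def reachable(graph, start, goal):
        """Breadth-first search: walk the edges outward from start
        until we hit goal or run out of unvisited nodes."""
        seen, queue = {start}, deque([start])
        while queue:
            city = queue.popleft()
            if city == goal:
                return True
            for neighbor in graph[city]:
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(neighbor)
        return False

    print(reachable(roads, "Shelbyville", "Ogdenville"))  # True
    ```

    The same shape models anything made of things and connections: web pages and links, people and friendships, tasks and dependencies.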

    Read the article

  • How to Make a 9 Layer Density Column [Video]

    - by Jason Fitzpatrick
    Density columns, layers of liquids of varying density in a glass cylinder, are nothing new in the world of science demonstrations, but this nine-layer one with seven floating objects is something to see. Courtesy of Steve Spangler Science, the experiment goes above and beyond the traditional five-layer column by adding another four layers and sinking objects of varying density into the column. The end result is a colorful demonstration of the varying densities of liquids and solids. [via Boing Boing]

    Read the article
