Search Results

Search found 22480 results on 900 pages for 'internet archive'.


  • More Stuff less Fluff

    - by brendonpage
    Originally posted on: http://geekswithblogs.net/brendonpage/archive/2013/11/08/more-stuff-less-fluff.aspx

    YAGNI – "You Aren't Going To Need It". This acronym is commonly used in software development to remind developers to only write what they need. It exists because software developers have gotten into the habit of writing everything they need to solve a problem, and then everything they think they could possibly need in the future. Since we can't predict the future, a large portion of the code we write this way is never used. That extra code causes unnecessary complexity, which makes the code harder to understand and harder to modify when we inevitably have to write something we didn't think of.

    I've known about YAGNI for some time now, but I never really got it. The words made sense and the idea was clear, but the concept never sank in. I was one of those devs who'd happily write a ton of code in anticipation of future needs. In my mind this was an essential part of writing high-quality code. I didn't realise that in doing so I was actually writing low-quality code. If you are anything like me, you are probably thinking "Lies and propaganda! High quality code needs to be future proof." I agree! But what makes code future proof? If we could see into the future the answer would be simple: code that allows for or meets all future requirements. Since we can't see the future, the best we can do is write code that can easily adapt to future requirements; this means writing flexible code. Flexible code is: fast to understand, fast to add to, and fast to modify. To be flexible, code has to be simple, which means only making it as complex as it needs to be to meet those three criteria. That is high quality code. YAGNI! The art is in deciding where to place the seams (abstractions) that will give you flexibility without making decisions about future functionality. Robert C. Martin explains it very nicely: he says a good architecture allows you to defer decisions, because if you can defer a decision then you have the flexibility to change it.

    I've recently had a YAGNI experience which brought this all into perspective. I was working on a new project which had multiple clients that connect to a server hosted in the cloud. I was tasked with adding a feature to the desktop client that would allow users to capture items that would then be saved to the cloud. My immediate thought was "Hey, we have multiple clients, so I should build a web service for these items; that way we can access them from other clients", so I went to work and built exactly that. I stood back and gazed upon what I'd created with a warm fuzzy feeling. It was beautiful! Then the time came for the team to use the design I'd created for another feature with a new entity. Let's just say that they didn't get the same warm fuzzy feeling that I did when they looked at the design. After much discussion they eventually got it through to me that I'd bloated the design based on an assumption of future functionality. After much more discussion we cut the design down to something far simpler, which gives us future flexibility with no extra work: it is as complex as it needs to be. It has been a couple of months since this incident and we still haven't needed to access either of the entities from other clients. Using the simpler design allowed us to do more stuff with less stuff!
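
    To make the trade-off concrete, here is a minimal, hypothetical Python sketch: the names are invented for illustration and this is not the actual design from the project described above. The first class speculates about clients that don't exist yet; the second does only what today's single client needs, leaving a seam where a cloud call could go later.

        # Hypothetical illustration of the YAGNI trade-off described above.
        class SpeculativeItemService:
            """Imagined 'future-proof' service layer."""
            def create(self, item): ...
            def update(self, item): ...          # no client calls this yet
            def delete(self, item): ...          # no client calls this yet
            def find_by_owner(self, owner): ...  # no client calls this yet

        # YAGNI version: only what the desktop client needs today.
        class ItemStore:
            def __init__(self):
                self._items = []

            def save(self, item):
                # The seam: swapping this body for a web-service call later
                # is a deferred decision, not a speculative one.
                self._items.append(item)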

    Read the article

  • Computer not displaying anything

    - by Justen
    The computer in question worked last week. It's brand new, has never been connected to the internet, has up-to-date parts, etc. Last week, I installed some 3D software on it, then shut it down and waited for the license files that arrived this week. I've done this to 4 other PCs with the same hardware and software. I've tried: hooking it up to 3 different monitors using different video cables, 2 other graphics cards, and a handful of other things like switching which port it's plugged into before and after a restart. I have the driver disc and a Windows reinstallation disc, but they won't do me any good because there is no display at all, for any interval of time. All the fans are moving: PSU, GFX, CPU, etc., so I don't believe it's a power issue. Here are the specs I know of: 4GB RAM, 8800 GTX, 700W PSU, Intel dual core (not sure of the model). Anyway, I'm open to ideas.

    Read the article

  • D-Link DWA-125 constantly disconnecting in Windows XP, but works fine on Ubuntu

    - by sil3nt
    Hey there, I'm dual-booting Windows XP and Ubuntu 10.10. I'm using a DWA-125 USB adapter to connect to my home wifi and I'm having an odd issue. First off, in Ubuntu the adapter works fine and I have no trouble at all, but when I boot back into XP I get a D-Link connection wizard and the connection times out after every 2 minutes. It's odd because I can actually load up web pages and use the internet for 2 minutes, and then it disconnects and the whole thing repeats. I did a system restore on XP and this stopped the whole connect-disconnect cycle, but once I booted into Ubuntu and came back to XP it started all over again (?). Any ideas as to how I would go about fixing this?

    Read the article

  • Why do apt-get and wget fail on my server when ping is working?

    - by klox
    Yesterday my server was still OK, but today after trying sudo apt-get update I got an error during the update process. I tried sudo rm /var/lib/apt/lists/* -vf, then tried the update again, but it didn't solve my problem; it may still be the same error. I checked my internet connection by trying ping google.com, and got this result: PING google.com (74.125.235.40) 56(84) bytes of data. From 136.198.117.254: icmp_seq=1 Redirect Network(New nexthop: fw1.jvc-jein.co.id (136.198.117.6)) 64 bytes from sin01s05-in-f8.1e100.net (74.125.235.40): icmp_req=1 ttl=53 time=20.6 ms 64 bytes from sin01s05-in-f8.1e100.net (74.125.235.40): icmp_req=2 ttl=53 time=18.2 ms 64 bytes from sin01s05-in-f8.1e100.net (74.125.235.40): icmp_req=3 ttl=53 time=33.0 ms 64 bytes from sin01s05-in-f8.1e100.net (74.125.235.40): icmp_req=4 ttl=53 time=30.0 ms 64 bytes from sin01s05-in-f8.1e100.net (74.125.235.40): icmp_req=5 ttl=53 time=28.1 ms Some sites said it may be caused by the getdeb server being down. I tried to install: jeinqa@SVRQAR:~$ sudo apt-get install pastebinit Reading package lists... Error! E: Encountered a section with no Package: header E: Problem with MergeList /var/lib/apt/lists/security.ubuntu.com_ubuntu_dists_precise-security_restricted_binary-amd64_Packages E: The package lists or status file could not be parsed or opened. I also tried sudo ufw status verbose, with the result: Status: inactive
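
    The "Problem with MergeList" error usually points to a corrupted package list file rather than a network problem, which would also explain why ping works while apt-get fails. A commonly posted recovery sequence (a sketch, not guaranteed for this exact setup):

        sudo rm -rf /var/lib/apt/lists/*   # also clears lists/partial, which a plain rm misses
        sudo apt-get clean
        sudo apt-get update                # rebuilds the package lists from scratch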

    Read the article

  • Windows Azure Interop

    - by kaleidoscope
    The Windows Azure Platform is an open cloud platform. What makes it interoperable? The Windows Azure platform supports popular standards and protocols including SOAP, REST, and XML. Developers can now use their preferred programming frameworks, including .NET and PHP. Tools such as Eclipse have been extended so that PHP developers can build Windows Azure applications. External endpoints (inbound traffic) have now been enabled for worker roles, which enables applications that receive internet traffic but aren't running under IIS. Windows Azure is interoperable with Java: at PDC 09, a solution accelerator for Tomcat was delivered. Tomcat is an open-source implementation of the Java Servlet and JavaServer Pages technologies. The Windows Azure solution accelerator leverages a PDC09 feature that enables arbitrary processes to bind to inbound service endpoints. Windows Azure is interoperable with PHP: the Windows Azure tools for Eclipse extension builds upon the PHP Development Toolkit (PDT) and integrates the Web Tools Platform (WTP) to provide a complete toolkit for Windows Azure web application development. For more details please refer to the link: http://www.microsoft.com/windowsazure/faq/ Rituraj

    Read the article

  • Oracle JDK 7u10 released with new security features

    - by Henrik Stahl
    A few days ago, we released JRE and JDK 7 update 10. This release adds support for the following new platforms: Windows 8 on x86-64 (note that Modern UI, aka Metro, mode is not supported); Internet Explorer 10 on Windows 8; and Mac OS X 10.8 (Mountain Lion). This release also introduces new features that provide enhanced security for Java applet and Web Start applications, specifically: The Java runtime tracks whether it is updated to the latest security baseline. If you try to execute an unsigned applet with an outdated version of Java, a warning dialog will prompt you to update before running the applet. The Java runtime includes a hardcoded best-before date. It is assumed that a new version will be released before this date; if the client has not been able to check for an update prior to this date, the Java runtime will assume that it is insecure and start warning the user before executing any applets. The Java control panel now includes an option to set the desired security level on a low-medium-high-very high scale, as well as an option to disable Java applets and Web Start entirely. This level controls things such as whether the Java runtime is allowed to execute unsigned code, and if so, what type of warning will be displayed to the user. More details on the security settings can be found in the documentation. See below for a sample screenshot. The new updates of the JRE and the JDK are available via OTN. To learn more about the release, please visit the release notes.

    Read the article

  • Download a website that requires log-in with HTTrack Website Copier

    - by H.Moss
    Hi guys! I have been researching how to download the content of a site that requires a username and password. This is actually harder than I thought it would be. I tried to use HTTrack Website Copier and followed the instructions below, but it's not working! Q: I can not access several pages (access forbidden, or redirect to another location), but I can with my browser; what's going on? A: You may need cookies! Cookies are specific data (for example, your username or password) that are sent to your browser once you have logged in to certain sites, so that you only have to log in once. For example, after having entered your username on a website, you can view pages and articles, and the next time you go to this site you will not have to re-enter your username/password. To "merge" your personal cookies into an HTTrack project, just copy the cookies.txt file from your Netscape folder (or the cookies located in the Temporary Internet Files folder for IE) into your project folder (or even the HTTrack folder).
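
    For reference, cookies.txt uses the Netscape cookie file format: one tab-separated line per cookie with domain, subdomain flag, path, secure flag, expiry (Unix time), name, and value. A hypothetical example (invented values; only the layout matters):

        # Netscape HTTP Cookie File
        .example.com	TRUE	/	FALSE	1999999999	sessionid	abc123def456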

    Read the article

  • Is micro-optimisation important when coding?

    - by BozKay
    I recently asked a question on stackoverflow.com to find out why isset() was faster than strlen() in PHP. This raised questions about the importance of readable code and whether performance improvements of microseconds in code were worth even considering. My father is a retired programmer; I showed him the responses and he was absolutely certain that if a coder does not consider performance in their code, even at the micro level, they are not a good programmer. I'm not so sure - perhaps the increase in computing power means we no longer have to consider these kinds of micro-performance improvements? Perhaps this kind of consideration is up to the people who write the actual language implementation (of PHP in the above case)? The environmental factors could be important too - the internet supposedly consumes 10% of the world's energy, and I wonder how wasteful a few microseconds of code is when replicated trillions of times on millions of websites. I'd like to know answers preferably based on facts about programming. Is micro-optimisation important when coding? EDIT: My personal summary of 25 answers, thanks to all. Sometimes we need to really worry about micro-optimisations, but only in very rare circumstances. Reliability and readability are far more important in the majority of cases. However, considering micro-optimisation from time to time doesn't hurt. A basic understanding can help us not to make obviously bad choices when coding, such as if (expensiveFunction() && counter < X) which should be if (counter < X && expensiveFunction()) (example from @zidarsk8). This could be an inexpensive function, in which case changing the code would be micro-optimisation. But with a basic understanding you would not have to, because you would write it correctly in the first place.
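
    A minimal Python sketch of the @zidarsk8 example (hypothetical function names; the original was PHP-flavoured pseudocode) showing why operand order matters under short-circuit evaluation:

        import time

        def expensive_function():
            # Stand-in for a costly call, such as a database query.
            time.sleep(0.01)
            return True

        counter, x = 1000, 100

        # Wasteful: expensive_function() always runs, even though
        # counter < x is False and the whole condition must fail.
        if expensive_function() and counter < x:
            pass

        # Better: 'and' short-circuits, so the expensive call is
        # skipped whenever counter < x is already False.
        if counter < x and expensive_function():
            pass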

    Read the article

  • Minimizing data sent over a webservice call on expensive connection

    - by aceinthehole
    I am working on a system that has many remote laptops, all connected to the internet through cellular data connections. The application will synchronize periodically to a central database. The problem is that, due to factors outside our control, the cost to move data across the cellular networks is spectacularly high. Currently we are sending a compressed XML file across the wire, where it is processed and various things are done with it (mainly stuffing it into a database). My first thought was to convert that XML doc to JSON just prior to transmission and convert it back to XML just after receipt on the other end, getting some extra compression for free without changing much. Another thought was to test various other compression algorithms to determine the smallest one possible. Although, I am not entirely sure how much difference JSON vs XML would make once it is compressed. I thought that there must be resources available that address this problem from an information theory perspective. Does anyone know of any such resources, or have suggestions on what direction to go in? This is developed on the MS .NET stack on Windows, for reference.
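
    Since the JSON-vs-XML question is easy to measure rather than guess at, here is a quick Python sketch (hypothetical payload; the real schema isn't shown in the question) comparing compressed sizes of equivalent XML and JSON encodings:

        import json
        import zlib

        # Invented record batch standing in for the real sync payload.
        records = [
            {"id": i, "name": "laptop-%d" % i, "status": "synced"}
            for i in range(200)
        ]

        xml = "<records>" + "".join(
            "<record><id>%d</id><name>%s</name><status>%s</status></record>"
            % (r["id"], r["name"], r["status"])
            for r in records
        ) + "</records>"

        js = json.dumps(records, separators=(",", ":"))

        for label, payload in (("xml", xml), ("json", js)):
            raw = payload.encode("utf-8")
            # zlib level 9 as a rough stand-in for "best generic compression".
            print(label, len(raw), "->", len(zlib.compress(raw, 9)))

    In practice the gap often narrows sharply after compression, since the repetitive tag names are exactly what DEFLATE-style compressors handle well.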

    Read the article

  • Gigabit network with ADSL modem

    - by user51006
    Hi, I would like to build a gigabit network to connect the different computers at my home. I already have a 100Mb ADSL router/modem, but I'm having a hard time finding a 1000Mb one, so I came up with the idea of putting a normal 1000Mb router or switch between the ADSL modem and the computers. I would then only use the ADSL modem to connect to the internet, and the gigabit router as DHCP server to connect the different computers. But to get to the question: will the 100Mb ADSL router slow down my network? Will data travel like pc -> gigabit router -> pc, or like pc -> gigabit router -> adsl router -> gigabit router -> pc? My gut says it will go like the first option, but I have seen network hardware do some weird stuff. What do you think?

    Read the article

  • Friday Fun: Abduction

    - by Mysticgeek
    Finally another Friday has arrived, and it's time to waste the afternoon on company time playing a flash game. Today we take a look at a fun game called Abduction. Abduction is a neat game where you snatch people and livestock to sell them on the intergalactic market. The controls are basic, using the arrow keys or W,A,S,D and the left mouse button. There is a tutorial that you can play first to get the hang of it. While you're abducting hillbillies, they throw pitchforks and other objects at your craft, so you need to avoid them. The game has several levels to keep you distracted until quitting time. Play Abduction at FreeWebArcade.

    Read the article

  • Web Development Environment: How to distribute edited hosts files over a bunch of Mac machines?

    - by Alex Reds
    I am doing some research to prepare a web development environment for our small (10 people and growing) new office. Use case: for each new web project we usually create a new alias on an Apache server, e.g. someproject.companywebsite. From my understanding, in order to see this website locally, all the rest of our team (including managers and directors) will need to edit their hosts file (e.g. "192.168.1.10 someproject.companywebsite"), and do so each time for a new project (which can be 2-5 each week). Solution: I am looking for a way to edit this hosts file only once and distribute it over all the Mac machines in our network at once, or at least much more smoothly than poking around with each machine every time, over and over again (see the sketch after this item). Is that possible? Or is that a very wrong way of doing it? Perhaps we'd be better off setting up our own local DNS server and pointing our router to it? Though a DNS server of our own concerns me a bit, because of possible network interruptions and other lags, if you know what I mean. Or perhaps there are other workflows for this? What's the best way to handle such things? I'll be grateful to hear some advice from experienced admins. I couldn't find this info on the internet, so if you know where to read about it, point me in the right direction. Thank you in advance. Alex
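
    For the "edit once, distribute everywhere" approach, here is a minimal Python sketch (hypothetical addresses and account name; it assumes SSH keys and passwordless sudo are already set up on the Macs) that pushes a master hosts file out over SSH:

        import subprocess

        # Invented inventory - replace with the office Macs' real addresses.
        MACS = ["192.168.1.21", "192.168.1.22", "192.168.1.23"]
        MASTER = "hosts"  # the single edited copy, e.g. kept in version control

        for addr in MACS:
            # Copy the master file over, then move it into place as root.
            subprocess.run(["scp", MASTER, "admin@%s:/tmp/hosts" % addr],
                           check=True)
            subprocess.run(["ssh", "admin@%s" % addr,
                            "sudo mv /tmp/hosts /etc/hosts"],
                           check=True)

    That said, a small local DNS server with a wildcard record (e.g. *.companywebsite pointing at 192.168.1.10) would remove the distribution problem entirely.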

    Read the article

  • Virtual firewall to protect hypervisor

    - by manutenfruits
    I am running Ubuntu Server 12.10 on a single host connected to a NATed router, which connects using PPPoE to an optical fiber modem. This server is meant to be accessed from the Internet, but also to be used from the LAN for SVN, MySQL and whatnot. The issue is that the router is not customizable enough for this, so I was thinking about creating a virtual pfSense firewall using KVM inside the server itself, removing the need for the router. Is this possible? Can the host ignore and block all traffic coming to itself, but not traffic destined for the firewall? I am aware this is not the most desirable environment; I accept suggestions based on budget!

    Read the article

  • Block P2P traffic on a Linksys router WRT54G with Tomato firmware

    - by Kami
    I'm running a small wireless network (6 to 10 users) on a Linksys WRT54G with Tomato firmware, sharing an Internet connection. I don't want the users to download files with BitTorrent (the main offender) and other P2P apps. I've found some solutions about lowering P2P traffic priority using QoS, but I really need to ban P2P traffic outright. Does anyone know how to set up some rules to deny that kind of traffic? I've tried to set up an Access Restriction rule; however, it's not working at all.
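
    One approach, assuming your Tomato build includes the ipp2p netfilter match (many do), is a custom rule under Administration > Scripts > Firewall. This is a sketch rather than a guaranteed fix, since ipp2p only recognizes unencrypted P2P protocols:

        iptables -I FORWARD -m ipp2p --ipp2p -j DROP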

    Read the article

  • A different interface for SQL Server Reporting Services?

    - by AngryHacker
    I have a SQL Server 2005 Reporting Services implementation. It seems that the only way to actually access the reports is for the users to use Internet Explorer, and the web page uses an ActiveX control to do its printing (and probably other functions as well). Does SSRS have a different way to access its functionality via the web browser, maybe Java- or HTML-based? If so, how do I actually turn it on? The reason I am asking is that security is being tightened and ActiveX controls will be banned, so the users won't be able to print.
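
    One ActiveX-free route that SSRS 2005 does support is URL access: rendering a report straight to PDF (or Excel) and letting the browser's PDF viewer handle printing. A hypothetical example (invented server name and report path):

        http://yourserver/ReportServer?/SalesReports/MonthlySummary&rs:Command=Render&rs:Format=PDF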

    Read the article

  • Set up ad hoc wireless connection between Windows Vista and Mac OS X

    - by Skarab
    I have the following problem: Windows Vista does not connect to an ad hoc wireless network created on my MacBook. I have tried to create both a secured (with a 40-bit key) and an unsecured network, but Windows Vista still has problems connecting. Windows Vista informs me, after 5 minutes of attempts, that setting up the connection with my ad hoc network took too much time. My question: do I need to configure some settings on Vista to connect it to my MacBook? Maybe it is a problem with DHCP? Edited: I have tried it the other way around: http://superuser.com/questions/202890/set-up-an-adhoc-network-in-windows-vista-to-connect-to-and-share-the-internet-con

    Read the article

  • mod_proxy failing as forward proxy in simple configuration

    - by Stabledog
    (On Mac OS X 10.6, Apache 2.2.11) Following the oft-repeated googled advice, I've set up mod_proxy on my Mac to act as a forward proxy for HTTP requests. My httpd.conf contains this: <IfModule mod_proxy> ProxyRequests On ProxyVia On <Proxy *> Allow from all </Proxy> (Yes, I realize that's not ideal, but I'm behind a firewall trying to figure out why the thing doesn't work at all.) So, when I point my browser's proxy settings to the local server (ip_address:80), here's what happens: I browse to http://www.cnn.com; I see via a sniffer that this is sent to Apache on the Mac; Apache responds with its default home page ("It works!" is all this page says). So Apache is not doing as expected: it is not forwarding my browser's request out onto the Internet to CNN. Nothing in the logfile indicates an error or problem, and Apache returns a 200 header to the browser. Clearly there is some very basic configuration step I'm not understanding... but what?
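
    Serving the default page instead of forwarding is the classic symptom of mod_proxy being loaded without its protocol handler: the core proxy module alone doesn't know how to proxy HTTP, so mod_proxy_http must also be loaded. A minimal httpd.conf sketch (module paths vary by platform; on OS X they typically live under libexec/apache2/):

        LoadModule proxy_module libexec/apache2/mod_proxy.so
        LoadModule proxy_http_module libexec/apache2/mod_proxy_http.so

        <IfModule mod_proxy>
            ProxyRequests On
            ProxyVia On
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
        </IfModule>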

    Read the article

  • How can I share a video file during a webinar?

    - by Brien Malone
    Here is the scenario: I have a number of remote employees around the globe. I want to have a video chat session; no problem there. Halfway through, I want to shut off all camera video feeds and simulcast (synchronously) a training video to my team. How do I do this? We have tried Office Communicator, but the frame rate was awful and there was no audio. Adobe Connect had similar trouble. In both cases we were limited by the main office's small internet pipe, but it is clear that video delivered via shared desktop is not a good solution.

    Read the article

  • My HP Vista-based laptop has become very slow recently

    - by goldenmean
    My HP laptop runs Vista Home Premium. When I try to start Firefox or Internet Explorer, it becomes very slow; no other app is affected. When I checked Performance in Task Manager, it showed the free physical memory as 0 bytes, almost always. This is recent; earlier it didn't use to be zero. The laptop has 2GB of RAM. I have nothing running in my tray except the sound control, laptop power plan indicator, and network status indicator. There are no other processes whose memory usage adds up so high as to make the free memory 0. So what could be hogging the memory and making the laptop very slow? Any pointers would help, as it is crawling at the moment.

    Read the article

  • Win2k3 server network problem

    - by Sam
    I'm running 4 Win2k3 64-bit servers in the same subnet. It's been more than a year that I've been running them without a problem. Recently, I keep losing the connection to one of the servers. Let's say it's 'server A' which has the problem. Losing the connection means that I can't access server A from the other servers. I've checked whether server A has any internet connection problems or any abnormal event logs in eventvwr, but haven't found any problems. The problem is usually resolved if I restart the server. But as time goes by, it keeps happening again and again. I can't afford to restart the server every time, and I really want to find out the reason. Can anyone help me out? Let me know if you need any more information.

    Read the article

  • Problem with sound driver

    - by JiminP
    I had a problem where sound had lag in Flash (other than that, there was no problem). On the internet, I found that installing OSS4 might help, so I installed OSS4, but there were some problems (no sound in Flash, and I couldn't use the function keys on the laptop, which is quite annoying), so I tried to remove OSS4 and re-install the sound modules. After some mess-up, the whole sound was gone. I've used Ubuntu for a year, but I don't know how to use the terminal well (all I know is basic commands like sudo, ls, or apt-get). Now I'm trying to recover by following instructions at this page, but I have some problems... :( When I try to follow the instructions: sudo aplay -l finds no sound drivers. find /lib/modules/$(uname -r) | grep snd returns nothing. When I try to do sudo apt-get install linux-restricted-modules-$(uname -r) linux-generic, it says that it can't find the linux-restricted-modules-3.0.0-13-generic package. lspci -v | grep -A7 -i "audio" returns output which doesn't contain anything about the name of the driver. Writing sudo modprobe sn and pressing tab twice only returns sudo modprobe sn9c102. sudo aptitude --purge reinstall linux-sound-base alsa-base alsa-utils linux-image-$(uname -r) linux-ubuntu-modules-$(uname -r) libasound2 runs, but didn't change any of the above. sudo apt-get install linux-alsa-driver-modules-$(uname -r) fails because it can't find the package linux-alsa-driver-modules-3.0.0-13-generic. Compiling the ALSA driver doesn't work: when I try to make, it says that /lib/modules/3.0.0-13-generic/build/include/linux/modversions.h doesn't exist. I'm using Ubuntu 11.10. Can anyone help me? I can re-install Ubuntu, but I don't want to....

    Read the article

  • pfSense CARP - WAN failure on firewall

    - by eldblz
    I have recently configured 2 firewalls (on 2 DELL PowerEdge R210 IIs with ESXi 5.1) with pfSense. We have several LANs and 2 WANs. Everything is running fine, but I have a strange behavior: I can access the internet from all LANs but not from the firewall itself. For example, the firewall cannot retrieve package information, and if I set up a gateway monitor IP (like Google's 8.8.8.8) it fails. These are the screenshots of the firewall configuration: http://imgur.com/a/LNuMz#0 For now I have kept the firewall rules to a minimum to avoid problems or conflicts. Any ideas how to solve the problem? Thank you in advance.

    Read the article

  • Robocopy hiding folders on backup drives

    - by Neil Barnwell
    I have a backup batch file that uses Robocopy to back up my files: robocopy "C:\" "G:\Default\RoboCopyBackup\C" /XF Pagefile.sys /XD "System Volume Information" "Recycler" "Temporary Internet Files" "Installer Cache" "Temp" /E /R:1 /W:0 /TEE /XJ This should create a folder structure on the external backup drive like so: G:\Default\RoboCopyBackup\C\... However, G: appears totally empty. What is weird is that the folders and files are there! If I type the above path into the address bar, I see all the files and folders! Can anyone help me work out why? I think it might be some NTFS-based ownership/permissions thing, but I'm not sure.
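
    A likely explanation (an educated guess, not confirmed from the question): when Robocopy's source is a drive root like "C:\", it copies the root's hidden and system attributes onto the destination folder, which makes the folder invisible in Explorer even though the path still works when typed. A sketch of the usual check and fix from a command prompt:

        attrib "G:\Default\RoboCopyBackup\C"
        attrib -s -h "G:\Default\RoboCopyBackup\C"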

    Read the article

  • Proxy Methods for Hosting a Low-Bandwidth Dynamic Website

    - by Casey
    I am building a webcam with an HTTP server that will be running from a low-bandwidth connection. The content on the site will be changing every 5 to 10 minutes. Instead of serving files directly from this connection, are there hosting companies that can act as a public caching proxy for my site? That way, if nobody is using the site, the local internet connection remains idle; and if I receive 1000 hits all at the same time, only one HTTP GET to my end is required, and the hosting company (on a fat pipe) serves the other 999 requests. This doesn't sound like a very common usage model, but I feel like this would be the optimal solution for my situation.

    Read the article

  • MikroTik server networks and Cain & Abel

    - by user269742
    I'm connected to the internet via a MikroTik server network. Recently, I read about a scary application named Cain & Abel and all the capabilities it offers to malicious users. I don't know if anyone on my network is using, or is even aware of, such an application, but my questions are: 1. How do I protect myself from this program? 2. How do I know if someone is using such an application against me? 3. Is the Tor Bundle capable of protecting me from Cain & Abel? 4. If I fill in my e-mail password on an SSL page, can Cain & Abel collect it? 5. Is it safe to use Skype or Yahoo Messenger voice chat if someone is using Cain & Abel on my network?

    Read the article
