Search Results

Search found 24201 results on 969 pages for 'andrew case'.


  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Windows Server 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html, UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem is that once the redirect is set, it also affects /specialdir: even if I right-click on that directory and select "content should come from ... local directory", the change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting; IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I make specialdir override the redirection to the new URL?
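
    A hedged aside, not part of the original question: this kind of redirect is sometimes scripted directly against the IIS 6 metabase when the UI change refuses to stick. A minimal sketch, assuming a default install (adsutil.vbs lives in Inetpub\AdminScripts, and the site ID of 1 is a placeholder):

        REM Redirect the whole site to the external page
        cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/ROOT/HttpRedirect "http://www.bar.org/AwesomePage.html"

        REM Remove the inherited redirect from the one directory that should serve local files
        cscript C:\Inetpub\AdminScripts\adsutil.vbs DELETE W3SVC/1/ROOT/specialdir/HttpRedirect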

    Read the article

  • Getting an error in an Internet shortcut

    - by MJM
    I created a file with a .url extension and typed the following text into it (its URL is just a sample):

        [Internet Shortcut]
        URL=http://en.wikipedia.org/wiki/WebDAV

    In some cases I get this error: "The Target "" of this Internet Shortcut is not valid. Go to the internet shortcut property sheet and make sure the target is correct." (for example, if the path or name of the target file contains a space character). My default browser is Firefox. I want an Internet shortcut that opens in any browser and on any OS. How can I fix this problem? (Sorry if I am using the wrong terminology or grammar; I am self-taught in English.)
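
    For reference, a hedged sketch of a minimal working shortcut file (the URL below is a made-up example containing a space): the section header is written without a space in working shortcuts, and any space in the target URL is percent-encoded as %20, which matches the symptom described above.

        [InternetShortcut]
        URL=http://en.wikipedia.org/wiki/Main%20Page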

    Read the article

  • Why are there so few Wireless N Dual Band adapter PCI cards, only USB adapters instead?

    - by daiphoenix
    There have been several Wireless N Dual Band routers/APs on the market for quite some time now, and there are several Wireless N Dual Band USB adapters out there. But as for PCI/PCI-X card adapters, there seems to be only one (the Linksys WMP600N). Why is that? I find it very strange. Is it because the USB adapters are easier to install and can be used on multiple computers? If so, why isn't it the same case with single-band (2.4 GHz) Wireless N adapters? For those there are as many PCI card adapters as there are USB adapters. Also, can the USB adapters, despite the lack of an external antenna, offer the same level of performance as a card with external antennas?

    Read the article

  • When does a programmer know that a new job is not right?

    - by Mysterion
    I believe that the interview process is a mutual sales pitch: what can the employee offer the employer, and vice versa. Suppose an individual has been careful in selecting their new employer (via thorough questioning during the interview process), but when they arrive at the job they find the employer has not been honest about certain aspects of it. Examples of this dishonesty could include:
    · The employee making it clear that technical excellence is an important factor, which is promised by the employer, but is not fully delivered, or a good technical structure does not exist.
    · The employee stating they want to work on well-architected and short (let's say less than one year) projects, yet when they start they find they are placed on a poorly architected older project.
    · The employee being told of a pair-programming environment to get them up to speed on the project, but being left to their own devices and questions on arrival.
    · The employee being promised a culture that encourages innovation and technical excellence, but finding that this is not the case (e.g. using technology for knowledge retention is laughed at).
    I know that a lot of famous developers feel that you make the place you work at. Is it realistic for a new employee with limited experience in the industry (say less than 5 years) to join the company and change attitudes, or even challenge the employer on the perceived dishonesty? Should they stay in the job or cut their losses?

    Read the article

  • Restoring a Windows 7 backup onto Server 2008 R2

    - by Colin Desmond
    I have recently moved my main machine from Windows 7 to Server 2008 R2. Just before the install, I used the Windows Backup facility to create a backup on a file share. I am now using the Server Backup facility in 2008 R2 to restore it, and it is not working. When I use another Win7 machine and point the restore program at the folder containing the backup file set, it lets me look through them. In R2, I point it at the same folder and it tells me that "The specified remote share does not contain any backup." I naively assumed the two systems would be compatible; is this not the case? Is there any way to get that data back?
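
    A hedged diagnostic aside, not from the original post: Server 2008 R2 ships a command-line tool, wbadmin, that can list the backup sets it recognizes at a given location; if it also sees nothing, the two formats really are incompatible. The share path below is a placeholder:

        wbadmin get versions -backupTarget:\\fileserver\backups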

    Read the article

  • scp -q isn't quiet between different hosts

    - by pythonic metaphor
    So scp -q file host:file and scp -q host:file file are both quiet, i.e. they don't show the progress meter. But when I run scp -q host1:file host2:file, I still get the progress meter, as well as a "Connection to host1 closed." message. The progress meter can be gotten rid of by redirecting stdout to /dev/null (although I'd rather not have to), but the connection-closed message comes on stderr, which I definitely want to keep in case there's a real error. How can I make scp quiet? Do I have to run ssh host1 "scp -q file host2:file"?
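
    A hedged aside that isn't in the original question: OpenSSH 5.7 and later have a -3 flag that routes a remote-to-remote copy through the local machine instead of telling host1 to run scp itself, so the local -q applies. A sketch, assuming a new enough client:

        scp -3 -q host1:file host2:file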

    Read the article

  • Ubuntu-to-Ubuntu backup on an internal network

    - by amirash
    Hey, I have my development "home" server, which runs Ubuntu 10. Today I bought a computer in order to make a backup of the development server on it (the development server also backs itself up every day, but I'm paranoid, so I want to have two backups on different computers, just in case). What is the best way to back up the system core of the development server (like Norton Ghost) and to do full and incremental backups of it to the new computer I bought? rsync? rdiff? scp? Clonezilla?
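
    Not an answer from the page, just a minimal sketch of the rsync route for the file-level part, assuming SSH access as root and placeholder host/path names (Clonezilla would cover the Norton Ghost-style whole-disk image):

        # Pull a full copy of the server, skipping pseudo-filesystems; repeat runs are incremental
        rsync -aAX --delete \
              --exclude=/dev --exclude=/proc --exclude=/sys --exclude=/tmp \
              --exclude=/mnt --exclude=/media --exclude=/lost+found \
              root@devserver:/ /backups/devserver/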

    Read the article

  • In vim, prevent caret moving back when I leave edit mode?

    - by romkyns
    In vim, if I enter and leave edit mode without doing anything, the caret ends up one character to the left. And if I enter and leave append mode, the caret moves forwards and then backwards. Any way to configure vim to leave the caret alone in these cases? Ideally I just want to always enter append mode, but without moving the caret when I enter or exit the mode. (Currently I usually use insert mode because it doesn't mess up my caret position upon entry. That is, except when I need to append to the end of the line, in which case I swear at vim for behaving in such an archaic fashion, press Esc and enter append mode.)
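
    For what it's worth, a commonly cited .vimrc sketch for exactly this annoyance uses the `^ mark, which vim sets to the position where insert mode was last left:

        " Keep the caret in place when leaving insert mode
        inoremap <silent> <Esc> <Esc>`^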

    Read the article

  • Network or router problem?

    - by Robert H Mercer
    I have two computers networked through a Netgear WG914 router, which is connected to an Ipstar satellite modem via the RJ45 LAN connector. The main computer runs Windows 7 Pro 64-bit and the secondary computer runs Windows 7 Home Edition 32-bit. On occasion there is a loss of internet connection, usually on the main computer, often followed by a loss on the secondary computer, though not always. The Windows troubleshooter, naturally, is about as much use as a politician in government: none. There appears to be no problem with the ISP, bad as it is, and often the connection is re-established without any help from me. The difference in the OSes is not relevant, since I had this problem before I changed over to Windows 7. Not being a networking nerd, I wonder if anyone has any constructive suggestions.

    Read the article

  • XNA 2D top-down game: foreach doesn't work for checking Enemy and Switch-Tile

    - by aldroid16
    Here is the gameplay. There are three conditions. The player steps on a Switch-Tile and it becomes false. 1) When the Enemy steps on it (trapped) AND the player steps on it too, the Enemy will be destroyed. 2) But when the Enemy steps on it AND the player DIDN'T step on it too, the Enemy will escape. 3) If the Switch-Tile condition is true, then nothing happens. The effect is activated when the Switch-Tile is false (the player stepped on it). Because there are a lot of Enemies and a lot of Switch-Tiles, I have to use foreach loops. The problem is that after the Enemy has ESCAPED (case 2) and steps on another Switch-Tile, nothing happens to the enemy! I don't know what's wrong. The effect should be the same, but the Enemy passes the Switch-Tile like nothing happened (they should be trapped). Can someone tell me what's wrong? Here is the code:

        public static void switchUpdate(GameTime gameTime)
        {
            foreach (SwitchTile switchTile in switchTiles)
            {
                foreach (Enemy enemy in EnemyManager.Enemies)
                {
                    if (switchTile.Active == false)
                    {
                        if (!enemy.Destroyed)
                        {
                            if (switchTile.IsCircleColliding(enemy.EnemyBase.WorldCenter,
                                                             enemy.EnemyBase.CollisionRadius))
                            {
                                // Reduce enemy speed while it is trapped on the tile (for about two seconds)
                                enemy.EnemySpeed = 10;
                                enemy.Trapped = true;

                                float elapsed = (float)gameTime.ElapsedGameTime.Milliseconds;
                                moveCounter += elapsed;

                                if (moveCounter > minMoveTime)
                                {
                                    // After two seconds, if the player didn't step on the Switch-Tile,
                                    // the enemy escapes and its speed returns to normal
                                    enemy.EnemySpeed = 60f;
                                    enemy.Trapped = false;
                                }
                            }
                        }
                    }
                    else if (switchTile.Active == true && enemy.Trapped == true &&
                             switchTile.IsCircleColliding(enemy.EnemyBase.WorldCenter,
                                                          enemy.EnemyBase.CollisionRadius))
                    {
                        // The player stepped on the Switch-Tile while a trapped enemy
                        // was on the same tile: destroy the enemy
                        enemy.Destroyed = true;
                    }
                }
            }
        }
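
    A hedged observation, not part of the original question: moveCounter looks like a single field shared by every enemy and every tile, and it is never reset when an enemy escapes, so once it exceeds minMoveTime the escape branch fires on the first frame of every later trap, which matches the symptom described. A minimal sketch of the trap branch with a per-enemy timer, assuming a float TrapTimer field can be added to Enemy:

        // Hypothetical per-enemy timer instead of the shared moveCounter
        enemy.EnemySpeed = 10;
        enemy.Trapped = true;
        enemy.TrapTimer += (float)gameTime.ElapsedGameTime.Milliseconds;

        if (enemy.TrapTimer > minMoveTime)
        {
            enemy.EnemySpeed = 60f;
            enemy.Trapped = false;
            enemy.TrapTimer = 0f; // reset so the next trap starts a fresh countdown
        }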

    Read the article

  • Why is apt-cache so slow?

    - by Damn Terminal
    After upgrading to Trusty (14.04) from Saucy (13.10), all apt operations are very slow, even those that do not involve downloading anything or connecting to any servers. For example, displaying the apt policy

        # time apt-cache policy
        [...]
        real    0m8.951s
        user    0m5.069s
        sys     0m3.861s

    takes almost ten seconds! Mostly a weird lag right after issuing the command, and it's the same even if I issue the same command again. On another system it doesn't take a tenth of a second:

        real    0m0.096s
        user    0m0.070s
        sys     0m0.023s

    The other system is a little beefier, but there was no noticeable difference before the upgrade. It's the same with apt-get and anything apt-related. How do I find out the source of this lag and fix it? Additional info:

        # cat /etc/nsswitch.conf
        # /etc/nsswitch.conf
        #
        # Example configuration of GNU Name Service Switch functionality.
        # If you have the `glibc-doc-reference' and `info' packages installed, try:
        # `info libc "Name Service Switch"' for information about this file.

        passwd:         compat
        group:          compat
        shadow:         compat

        hosts:          files dns
        networks:       files

        protocols:      db files
        services:       db files
        ethers:         db files
        rpc:            db files

        netgroup:       nis

    BTW, is my understanding of how apt-cache works correct? It doesn't make any network connections when I run apt-cache policy, right? In case I'm wrong and it matters, here are my sources: https://gist.github.com/anonymous/02920270ff68e23fc3ec
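
    A hedged diagnostic sketch, not from the original question: strace can show whether the lag is spent in name-service or other syscalls (the nis entry in nsswitch.conf above would be a classic suspect if network lookups dominate):

        # Tally syscalls and the time spent in each
        strace -c apt-cache policy >/dev/null

        # Or watch network-related calls with timestamps as they happen
        strace -tt -e trace=network apt-cache policy >/dev/null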

    Read the article

  • EDQ Technical Enablement for OPN (Prague - June 17-19)

    - by milomir.vojvodic
    Oracle Enterprise Data Quality (EDQ) Technical Enablement and Partner Training
    Trusted Data for Your Enterprise Applications

    Oracle Enterprise Data Quality helps organizations achieve maximum value from their business-critical applications by delivering fit-for-purpose data. These products also enable individuals and collaborative teams to quickly and easily identify and resolve any problems in underlying data. With Oracle Enterprise Data Quality, customers can identify new opportunities, improve operational efficiency, and more efficiently comply with industry or governmental regulation. Oracle Enterprise Data Quality is designed to serve as a very channel-friendly platform for OPN. This means that pre-built extensions, components and even complete business solutions can readily be built and shared. This allows our customers/partners to be highly efficient in how they deploy custom business solutions, but also allows our partners to develop specialized components, domain knowledge and even complete business solutions.

    Training is suitable for:
    · Database administrators
    · Architects
    · Technical staff

    Objectives of the training. After completing this course, participants should:
    · Have an understanding of the core functionality of EDQ across profiling, auditing, transforming, parsing and matching data
    · Be able to describe some of the key capabilities and benefits delivered by EDQ
    · Be able to create and run standalone EDQ processes and jobs
    · Be ready to start working with data from customers and (with practice) be able to demonstrate EDQ to customers

    Agenda

    17th June: Fundamentals for Demoing (Profile, Audit, Transform and More)
    · Profiling
    · Auditing
    · Transforming
    · Writing and exporting data
    · Jobs and scheduling
    · Publishing, packaging and copying EDQ processes
    · Introduction to the Customer Data Extension Pack
    · Realtime Processing via Web Services
    · The Server Console
    · Run Profiles
    · Data Interfaces
    · Sampling
    · Publishing metrics to the Dashboard
    · Users and security

    18th June: Matching
    · Matching overview
    · Basic matching configuration
    · Matching rule hierarchies
    · Clustering
    · Merging
    · Reviewing possible matches
    · Outputting match data
    · Case study

    19th June: Address Verification and Parsing
    · Address Verification overview
    · Configuration
    · Accuracy flags
    · Parsing overview
    · Phrase profiling
    · Tailoring a CDEP parser
    · Base tokenization
    · Classification
    · Reclassification
    · Selection
    · Resolution

    Register here. Don't miss this FREE event; space is limited.

    Oracle University, V Parku 2294/4, 148 00 Praha 4
    17.6. – 19.6. 2014, 9:00 a.m. – 5:30 p.m.

    Read the article

  • Not able to track traffic on subdomain using Google Analytics

    - by Steven
    I'm trying to track traffic for my sub-domain, but it's not happening. This is how it's set up: my partner has a domain called sub1.partner.com. This domain points to partner1.mydomain.com. The idea is that users think they are browsing my partner's website, when they are in fact browsing pages on my server. My tracking code looks like this:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);

        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
                   '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();

    In Google Analytics I've created a new account under my main account and called it partner1.mysite.com. On this account I have created a filter:

        Filter type: Include
        Filter field: Host name
        Filter pattern: partner1.mysite.no
        Case sensitive: No

    What more can I try to track traffic on my subdomain?

    UPDATE

    Question 1: Is this line correct? _gaq.push(['_setDomainName', '.mysite.com']);
    Question 2: Is it correct that I have to add \ before any punctuation, like \. , in filters?
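
    On Question 2, a hedged aside: Google Analytics filter patterns are treated as regular expressions, where an unescaped dot matches any character, so the literal-dot version of the pattern above would be written as:

        partner1\.mysite\.no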

    Read the article

  • Apache on Windows random long wait times

    - by Jaxbot
    I have a development machine with Apache installed as a service on Windows. The installation is fresh out of the box, with no changes to the configuration aside from adding the PHP module. From day one, I've had a problem where Apache freezes for about 11 seconds before replying to random requests. This appears to happen more frequently when the host hasn't been connected to in a while, but not always. I've eliminated MySQL, PHP, and the specific application; the long wait occurs even when loading a static file such as favicon.ico. Thus, the only factor remaining is Apache, which freezes for a consistent 10-11 seconds before replying. The problem is not the DNS problem that many people point to; the DNS lookup is instant, and the problem occurs both on localhost and 127.0.0.1. Thanks for the time.
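
    Not from the original post, but a commonly suggested httpd.conf sketch for Windows-specific stalls like this, assuming the directives match the installed Apache version:

        # Apache 2.2 on Windows: fall back from AcceptEx() to plain accept()
        Win32DisableAcceptEx

        # Apache 2.4 on Windows: the equivalent is done via AcceptFilter
        AcceptFilter http none
        AcceptFilter https none

        # Also frequently recommended on Windows
        EnableSendfile Off
        EnableMMAP Off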

    Read the article

  • local .pac-file URL format that works with IE and Safari (Windows)?

    - by legr3c
    Say I want to use a proxy auto-config file that is stored at C:\proxy.pac. To make Internet Explorer use this configuration, I have to specify the .pac file in the LAN settings in the following way: file://C:/proxy.pac. But Safari, which uses the same proxy settings, will ignore it in this case. To make Safari use the .pac file, I have to reference it as file:///C:/proxy.pac (three slashes at the beginning), which, according to Wikipedia, is the correct format. But this way Internet Explorer will ignore it. Opera and Chrome, which also use the same proxy settings, are fine with both ways, but is there another option that will work with Safari and Internet Explorer at the same time?
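
    A hedged workaround sketch, not from the original question: both browsers accept an http:// PAC URL, so the file can be served by any small local web server instead of referenced by path. Assuming Python 2 is installed, a throwaway example:

        cd C:\
        python -m SimpleHTTPServer 8080
        REM then set the PAC URL in both browsers to http://localhost:8080/proxy.pac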

    Read the article

  • Excel: Look up function for combinations of cells in a single column

    - by Rebecca
    I'm looking to find the number of times a certain combination of values appears in a single column. I was hoping to do this in Excel, but I'm starting to think it may not be possible. As an example, I have a list that looks like a longer vertical version of this:

        F1
        F3
        F2
        F4
        F1
        F3
        F4
        F1
        F3
        F4
        F1
        F3
        F4

    And I want to know how many times a specific order (say F1, F3, F4) occurs; in this example, 3 times (in my case the lookup sequences are 8 cells long). Is there a way to run over the whole column and identify the instances where this combination of cells occurs? I'm running Excel 2008 for Mac. Many thanks!
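
    A hedged sketch of one way to count this with a single formula, assuming the sample values sit in A1:A13: each range is shifted down one row, so the three conditions test three consecutive cells, and SUMPRODUCT adds up the starting positions where all three match (this returns 3 for the sample above; an 8-cell sequence would use eight shifted ranges):

        =SUMPRODUCT((A1:A11="F1")*(A2:A12="F3")*(A3:A13="F4"))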

    Read the article

  • Windows Filtering Platform not turning off until admin logon. Win2008R2sp1

    - by rjt
    I just installed Windows Server 2008 R2 SP1 to see if it would fix this problem, but it didn't. Until an administrator logs onto the domain controller, there are many events saying that WFP blocked a connection from Server60 to Server60, or from Server60 to Server70. Both Server60 and Server70 are the domain controllers. Once the admin logs on, the WFP events stop. The firewall is off by default GPO. Yes, I know that WFP kicks in during the boot sequence until the firewall takes over, or in my case does not take over (since Vista), but I clearly should not have to auto-logon to a domain controller and call autolock or something. Example event:

        Level   = Information
        Source  = Microsoft Windows Security Auditing
        EventID = 5152 "Filtering Platform Packet Drop" (and its evil twin, 5157 "Filtering Platform Connection")

        "The Windows Filtering Platform has blocked a connection."
        Direction        %%14593
        SourceAddress    192.168.10.60
        SourcePort       49677
        DestAddress      192.168.10.60
        DestPort         389
        Protocol         6
        FilterRTID       65667
        LayerName        %%14611
        LayerRTID        48
        RemoteUserID     S-1-0-0
        RemoteMachineID  S-1-0-0

    Tags: windows-server-2008-r2, WFP, BFE, WindowsFilteringPlatform, BaseFilteringEngine
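
    A hedged aside, not in the original post: if the goal is just to silence these audit events, the usual knob is auditpol on the affected subcategories (names as they appear in the security audit policy):

        auditpol /set /subcategory:"Filtering Platform Packet Drop" /success:disable /failure:disable
        auditpol /set /subcategory:"Filtering Platform Connection" /success:disable /failure:disable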

    Read the article

  • Setting up a Google Analytics Campaign

    - by Ashfame
    I will be doing a bunch of things to give one of my projects (the main app) a big initial push, for which I will be building a few small Facebook apps that will help promote the main app. Traffic from these apps needs to be tracked individually. My main app will post on users' walls when they need to be notified; traffic from these posts needs to be tracked. Traffic from emails sent by the main app needs to be tracked too, broken down by type of email. I need to track all of these, and possibly a couple more, but I need to be sure that I build my campaign URLs correctly, as I won't get another chance to fix them. Correct me where I am wrong:

        Campaign Name: Launch
        Campaign Medium: Email
        Campaign Source: Type1 or Type2 (I can break it down for different types of email, right?)

    For apps:

        Campaign Name: Launch
        Campaign Medium: Apps
        Campaign Source: App1 or App2 (I can break it down here for different apps, right?)

    What if I want to track two different links within a single email or a single app? Is there any way of tracking them individually while still being able to treat them as one, since tracking them as one makes more sense for me? Campaign Term and Campaign Content are irrelevant in my case, or can/should I use them for something? And I will also be tracking traffic across the different apps. Should I do more? Let me know if my scenario wasn't clear enough and I need to explain more.
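
    For reference, a hedged sketch of how those settings map onto campaign URL parameters (the domain is a placeholder); utm_content is the standard slot for telling two links in the same email or app apart while keeping campaign, medium and source identical:

        http://www.example.com/?utm_campaign=launch&utm_medium=email&utm_source=type1&utm_content=header-link
        http://www.example.com/?utm_campaign=launch&utm_medium=email&utm_source=type1&utm_content=footer-link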

    Read the article

  • IIS Windows Authentication: a way to turn off the password prompt

    - by Ellery Newcomer
    IIS noobie here. I have an intranet website hosted by IIS (6? I'll check after the weekend) that is set to Windows Authentication. Whenever there is some sort of authentication failure, the website gives the user a Windows-account-style password prompt, and doesn't display a 403 (or 401, or whatever) page unless the user cancels the prompt. However, entering a password is never a use case for this website. Is there a way to turn off the password prompt and just display the error page? Bonus: are there any good hooks at this precise point for code that would do diagnosis and logging?

    Read the article

  • Automated printouts from a wireless printer

    - by Piotr
    I have a wireless printer which is always on, and an always-on fanless Linux server. Looking at the mprinter project on Kickstarter, I started to wonder if there is already an app somewhere on the internet that will produce an automated daily printout based on some settings. Things to be printed could include: the weather forecast for my locations, todos scheduled for that day, a "quote of the day" or "word of the day", stats from Google Analytics for my site, and many more. I would set the printout for 6:15 every workday, so it's on my printer when I'm already up, having a coffee. Does anyone know of something that can be used for this purpose? While I know this can be done by combining the power of TeX, cron and a scripting language to manage the dynamic part of the PDF, I believe this is a use case someone has already addressed.
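
    Absent a ready-made app, a minimal sketch of the DIY route mentioned above; the script path is hypothetical, and lpr assumes CUPS already knows the wireless printer:

        # /etc/crontab: 6:15 on weekdays, as root
        15 6 * * 1-5 root /usr/local/bin/daily-printout.sh

        # /usr/local/bin/daily-printout.sh
        #!/bin/sh
        {
            date '+%A, %B %d, %Y'
            echo
            # ... fetch weather, todos, the quote of the day here ...
        } | lpr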

    Read the article

  • How to show a warning message when entering a folder?

    - by Valter Henrique
    I don't know if this is possible, but I have a folder which should show a warning message when a user enters it. In my case, the message would say that the folder can be deleted without previous warning to save some disk space. I have already created a file inside the folder with the warning message:

        WARNING!
        ##################################################################
        Please be advised that the folder /company-backup/amazon-s3 can be
        deleted without previous WARNING to save disk space, as the
        INFRASTRUCTURE TEAM judges necessary.

        Best regards,
        Infrastructure Team.
        ##################################################################

    Is that possible? Any ideas?
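
    There is no filesystem-level "on enter" hook, but for interactive bash users one hedged sketch (assuming the message file is named WARNING and this goes in the users' .bashrc) is to wrap cd so the file is printed on entry:

        # Print a WARNING file, if present, whenever cd lands in a directory containing one
        cd() {
            builtin cd "$@" || return
            [ -f ./WARNING ] && cat ./WARNING
            return 0
        }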

    Read the article

  • Why some recovery tools are still able to find deleted files after I purge Recycle Bin, defrag the disk and zero-fill free space?

    - by Ivan
    As far as I understand, when I delete a file (without using the Recycle Bin), its record is removed from the file system's table of contents (FAT/MFT/etc.), but the values of the disk sectors which were occupied by the file remain intact until those sectors are reused to write something else. When I use some sort of erased-file recovery tool, it reads those sectors directly and tries to rebuild the original file. What I can't understand is why recovery tools are still able to find deleted files (with a reduced chance of rebuilding them, though) after I defragment the drive and overwrite all the free space with zeros. Can you explain this? I thought zero-overwritten deleted files could only be found by means of special forensic-lab magnetic-scan hardware, and that those complex wiping algorithms (overwriting free space multiple times with random and non-random patterns) only make sense to prevent such a physical scan from succeeding, but in practice it seems that a plain zero-fill is not enough to wipe all traces of deleted files. How can this be?

    Read the article

  • How can we implement network search for Windows AND OS X clients?

    - by michielvoo
    We have a network with Windows 7 and OS X (10.5 and 10.6) computers. Our servers run Windows Server 2003 (1 Small Business Server, 2 Standard). We need to be able to search through about 15,000 to 30,000 documents in our archives. The best solution would be one where users can search directly from the Windows menu (on Windows 7) or the Spotlight menu (on OS X 10.5 and 10.6). Also good would be searching directly from the search bar in their browsers, or by first visiting a site with a search form. If users search through the browser, it's important that they are able to open a file in the search results just by clicking on it. I have tested Microsoft Search Server Express, but it doesn't meet the requirements (no OS X support, and results in the browser can't be opened by clicking in anything but Internet Explorer). I have looked at Spotlight Server, but that only supports OS X. Thanks!

    Read the article

  • Ubuntu: encrypt a user's home directory and protect it from the admin?

    - by Luc
    I have the following problem: I need to run some scripts on an Ubuntu machine, but I do not want those scripts to be visible to anybody. What would be the best way to do that? I was thinking of the following:
    · create a particular user
    · add the scripts to this user's home directory
    · protect and encrypt the user's home directory
    Can I run the script from outside if the directory is encrypted? Can the superuser see the contents of the home dir? Is there a right way to do this?

    UPDATE: I think the best way would be for root to own those scripts. In this case I would need to allow another user to modify the network configuration. Is it possible to give ONLY network rights to a user (via sudo or something else)?
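
    On the UPDATE question, a hedged sudoers sketch (edited with visudo; the username and command paths are placeholders that should match the local system):

        # Allow user 'netadmin' to run only these network tools as root, and nothing else
        netadmin ALL=(root) /sbin/ifconfig, /sbin/ip, /sbin/route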

    Read the article

  • Cron prepending filename to script output

    - by Caitifty
    I'm having an issue with unwanted lines being added to files output by a cron job. I have a script in /etc/cron.hourly which selects some data from a MySQL database and saves it in a text file in /var/www. When I run the script as root, it does exactly what I expect it to do. When the script is executed by cron, it creates the same file, but prepends the following three lines at the top of the output file:

        ::::::::::::::
        /var/www/outputfilename
        ::::::::::::::

    I can't for the life of me work out how to stop this unwanted behavior. The line in /etc/crontab for cron.hourly is the default "44 * * * * root cd / && run-parts --report /etc/cron.hourly". If I use su to become root and run "cd / && run-parts --report /etc/cron.hourly", the script runs as expected and the output doesn't have the mysterious additional three lines. I've also tried removing the --report flag from the run-parts command in case that was somehow connected, but no joy. Finally, the cron log output in /var/log/syslog just says cron.hourly ran, without giving any additional information. Any suggestions on solving this weird problem are most welcome.

    Read the article
