Search Results

Search found 2572 results on 103 pages for 'relative'.

  • Does this exist: a standardized way of documenting a file-system structure

    - by eegg
    At work, I'm in charge of maintaining the organization of a whole lot of varied data on a standard file-system. Part of this is coming up with a sensible classification (by similarity, need, read/write access, etc.), but the bigger part is actually documenting it: what documents/files/media should go where, what should not be in this directory, "for something slightly different, see ../../other-dir", and so on.

    At the moment, I've documented this using a plaintext file filing.txt in every directory I want to document. If someone is unsure what's meant to be in any directory, they read that file. This works alright, but it seems odd that I have this primitive custom solution to a problem that any maintainer of a non-trivial directory structure must experience. Every company I've known of, for example, has some kind of shared file-system where agreed terminology for categorization is important. In my experience, people just have to learn what's what by trial-and-error and experimentation.

    So allow me to propose a better solution, and hopefully you can tell me if it exists. Any directory on any filesystem can have a hidden plaintext file named .filing. Its contents are descriptive human language. It uses some markup like Markdown, with little more than bold, italic, and (relative) hyperlinks to other directories. Now a suitably-enabled file browser will check for a file named .filing whenever it displays a directory. If it exists, its contents are parsed and displayed in an unobtrusive pane near the directory-path widget. Any links therein can be clicked, and the user will be taken to the target directory of that link.

    I think that the effort of implementing such a standard would pay back many times over in usability gains. We would have, say, plugins for Nautilus, Konqueror, etc. It could be used to display directory information in the standard file lists served by webservers. And so on.

    So, question: does such a thing exist? If not, why not? Do people think it's a worthwhile idea?
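
    For anyone who wants to experiment with the idea, here is a minimal sketch of the proposed behaviour for an interactive bash shell (filing_cd is a made-up name, and .filing is the question's proposed convention, not an existing standard):

        # show the directory's .filing notes, if any, on every cd
        filing_cd() {
            builtin cd "$@" || return
            if [ -f .filing ]; then cat .filing; fi
        }
        alias cd=filing_cd

    A file-browser plugin would perform the same check on each directory change and render the markup instead of cat-ing it.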

  • Green flickering pixels that move with black images

    - by user568458
    Strange question... Occasionally, on my LCD screen, pixels that should be black flicker rapidly and constantly between black and green, about 4 flickers a second. The crazy part is, unlike dead/stuck pixels, they are relative to content on the screen and move with it. For example, I might be looking at a web page with a picture that has lots of black. There might be a couple of green flashing pixels in that black that shouldn't be there. I scroll the page, and the green flickering pixels move with the image. It seems that every physical pixel is fine, but somehow something interprets part of the image in a way that causes flickering green... It's not just in a web browser. My first thought was to blame a trolling blogger cunningly uploading an animated gif that simulates a failing pixel... but it happens in a wide range of applications. It seems to occur randomly, other than that it seems to only occur in areas of pure black, and it's always pure 100% green. It happens rarely enough that it's not a big deal, but it's such a strange problem it bugs me. I can't find any info on anything like this. I'm not even sure if it's hardware or software. Any ideas? (Windows 7 laptop connected to the LCD by a DVI-to-HDMI cable)

  • mod_rewrite help

    - by Benny B
    Ok, I don't know regex very well so I used a generator to help me make a simple mod_rewrite rule that works. Here's my full URL: https://www.huttonchase.com/prodDetails.php?id_prd=683 For testing to make sure I CAN use this, I used this:

        RewriteRule prodDetails/(.*)/$ /prodDetails.php?id_prd=$1

    So I can use the URL http://www.huttonchase.com/prodDetails/683/ If you click it, it works, but it completely messes up the relative paths. There are a few work-arounds, but I want something a little different:

        https://www.huttonchase.com/prod_683_stainless-steel-flask

    I want it to see that 'prod' is going to tell it which rule it's matching, 683 is the product number that I'm looking up in the database, and I want it to just IGNORE the last part; it's there only for SEO and to make the link mean something to customers. I'm told that this should work, but it doesn't:

        RewriteRule ^prod_([^-]*)_([^-]*)$ /prodDetails.php?id_prd=$1 [L]

    Once I get the first one to work I'll write one for categories: https://www.huttonchase.com/cat_11_drinkware And database-driven text pages: https://www.huttonchase.com/page_44_terms-of-service BTW, I can flip around my use of dash and underscore if need be. Also, is it better to end the URLs with a slash or without? Thanks!
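
    A hedged note on why the last rule fails: [^-]* cannot span a dash, and the slug stainless-steel-flask contains dashes, so the pattern can never match. A sketch that matches the numeric id and simply ignores the slug (assuming the rule lives in .htaccess, where the leading slash is stripped; untested against this site):

        # capture the digits after "prod_", ignore the SEO slug entirely
        RewriteRule ^prod_(\d+)_ /prodDetails.php?id_prd=$1 [L]

    The same shape should work for the cat_ and page_ URLs, pointed at whatever scripts serve categories and text pages.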

  • Adding Windows 7 32 bit as dual boot option

    - by djerry
    A relative of mine bought a new laptop this year with Windows 7 (64-bit) installed. Aside from some standard programs he uses on that laptop, he also has some software for his bike that needs to run. The developers of that program still don't support 64-bit systems, and therefore I thought about making it dual boot, so he can still use the power of the 64-bit system and, just for the bike program, boot the 32-bit version. My questions now are: What are the risks involved in this operation? What steps need to be taken to make this dual boot successful? Any other ideas besides dual booting? Thanks in advance. Edit: I might have forgotten/misphrased something. The software does run on 64-bit, but it cannot find the bike connected to the computer. So I think it's a matter of drivers which aren't compatible with the 64-bit system. That's why I wanted to install 32-bit Windows, so the drivers would work.

  • Completed downloads freeze Windows

    - by Ben Hooper
    The Issue

    Shortly after a file download via Google Chrome for Windows completes, the download will get stuck on "0 seconds left" and all other programs (except Google Chrome, for some reason, though browsing will not work) completely freeze into Windows' infamous "Not Responding" state, affecting Explorer particularly badly. Eventually, the programs will recover themselves, but they will recover significantly faster if you cancel the file download, relative to how quickly you react. Performing the exact same operation immediately after cancelling the download usually works without issue. This issue occurs with any file type (.ZIP, .MSI, .MSG, .PNG, .URL, etc.) of any size from any source (Dropbox, SourceForge, Imgur, even tiny and locally-generated BLObs created by my own Chrome extension, etc.) to any location.

    Potential Causes

    As this issue is so inconsistent, I haven't been able to prove whether the issue is Chrome-specific or being caused by my system or my Chrome configuration, but it's happening on both my work and home PCs. I originally suspected that this issue was being caused by security software scanning completed downloads for threats, but I'm not as confident in that theory anymore, as the issue persisted even after changing my security software from ESET NOD32 and Malwarebytes Anti-Malware Pro to ESET Endpoint to Microsoft Security Essentials.

    System Information (of both PCs)

    Windows version: 7 Service Pack 1 64-bit
    Google Chrome version: 30.0.1599.101 (but this has been happening for a long time)

  • How to include worksheets 3 and 4 in the cell formula provided?

    - by user21255
    I have been kindly given this formula with an explanation of how it works. Insert this formula into cell B4 of the sheet "Cases":

        =IF(NOT(ISBLANK('1st'!B25)),'1st'!B25,IF(NOT(ISBLANK(INDIRECT("'2nd'!R" & (ROW($B4)-(COUNTA('1st'!$B:$B)-COUNTA('1st'!$B$1:$B$24))-4+25) & "C" & COLUMN(B4),FALSE))),INDIRECT("'2nd'!R" & (ROW($B4)-(COUNTA('1st'!$B:$B)-COUNTA('1st'!$B$1:$B$24))-4+25) & "C" & COLUMN(B4),FALSE),""))

    Copy the formula to the other cells in the worksheet; the relative addresses will adjust automatically. The formula works like this: Check if there is content in 1st. If yes, copy it. If no, find out how many entries there are in 1st in total. (This is done by using the COUNTA function on the whole B column in 1st and subtracting the number of non-empty cells above the actual case data.) Use this information together with the current cell's number to find out the location of the cell that has to be copied from 2nd. Create the address of the cell and use the ISBLANK function on the INDIRECT function with that address to check if the cell is empty. If it is not, use the INDIRECT function again to display it. If it is empty, just display an empty string.

    Now this works fine when I have only 2 sheets. But let's say I want to include a third and fourth sheet (named 3rd and 4th respectively): what should I add to the formula above, and where? There are actually 31 sheets, but if I know how to add the 3rd and 4th sheet to the formula, then I can figure out how to do the rest. Thanks
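
    Structurally, each additional sheet just adds one more fallback level at the tail: if the 2nd-sheet cell is also blank, test the 3rd, and so on. A schematic sketch, not a paste-ready formula; each <Nth-cell> placeholder stands for an INDIRECT reference built like the 2nd-sheet one, except that the row offset presumably has to subtract the entry counts of all earlier sheets, not just 1st:

        =IF(NOT(ISBLANK(<1st-cell>)), <1st-cell>,
         IF(NOT(ISBLANK(<2nd-cell>)), <2nd-cell>,
         IF(NOT(ISBLANK(<3rd-cell>)), <3rd-cell>,
         IF(NOT(ISBLANK(<4th-cell>)), <4th-cell>,
         ""))))

    With 31 sheets this nesting gets unwieldy, and older Excel versions cap nested IFs at 7 levels, so a helper column per sheet may be needed in practice.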

  • Apache DAV at `/` with normal hosting at `/foo` - how?

    - by mandrake
    Should I not be able to have a configuration where I serve SVN repos with SVNParentPath at <Location /> and then override DAV and host normal files using another location, <Location /foo>? I wish to host my XSLT files on the same subdomain and still host repos at root. Of course, if I was to have a repo called foo, that would not be accessible, and that's ok.

        <VirtualHost *:80>
            ...
            # Host XSLT files here
            <Location /foo>
                DAV Off
            </Location>

            # Host my repos relative to root, such as /my_repo/
            <Location />
                DAV svn
                SVNParentPath "myrepos"
                SVNListParentPath on
                SVNIndexXSLT "/foo/my.xsl"
                ...
            </Location>
        </VirtualHost>

    But DAV SVN still looks for a repo:

        <?xml version="1.0" encoding="utf-8"?>
        <D:error xmlns:D="DAV:" xmlns:m="http://apache.org/dav/xmlns" xmlns:C="svn:">
            <C:error/>
            <m:human-readable errcode="720003">
                Could not open the requested SVN filesystem
            </m:human-readable>
        </D:error>
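
    A hedged observation that may matter here: Apache merges <Location> sections in the order they appear, with later sections overriding earlier ones, so the <Location /> block above re-enables DAV svn for /foo. A sketch with the order flipped, plus an Alias so /foo maps to a real directory (the /var/www/foo path is an assumption, not from the question):

        <Location />
            DAV svn
            SVNParentPath "myrepos"
            SVNListParentPath on
            SVNIndexXSLT "/foo/my.xsl"
        </Location>

        # later sections win, so the override must come after the root block
        Alias /foo /var/www/foo
        <Location /foo>
            DAV Off
        </Location>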

  • Issue with kernel boot [OVH SERVER]

    - by Conner Stephen McCabe
    Trying to install the OpenVZ kernel on CentOS 6.3. Yes, my kernel is installed (I can see it in the /boot folder), yes it is RHEL6, and yes it is all up to date; I checked this with yum update. My issue comes when I reboot my server with that kernel set as the default: it doesn't load. Below I shall put a copy of my grub.conf file and my menu.lst file.

    grub.conf:

        default=0
        timeout=5
        title vzkernel (2.6.32-042stab057.1)
            root (hd0,0)
            kernel /boot/vmlinuz-2.6.32-042stab057.1 ro root=/dev/sda1
            initrd /initramfs-2.6.32-042stab057.1.img
        title linux centos6_64
            kernel /boot/bzImage-3.2.13-xxxx-grs-ipv6-64 root=/dev/sda1 ro
            root (hd0,0)

    menu.lst:

        # grub.conf generated by anaconda
        #
        # Note that you do not have to rerun grub after making changes to this file
        # NOTICE: You have a /boot partition. This means that
        #     all kernel and initrd paths are relative to /boot/, eg.
        #     root (hd0,0)
        #     kernel /vmlinuz-version ro root=/dev/mapper/vg_stock-lv_root
        #     initrd /initrd-[generic-]version.img
        #boot=/dev/sda
        default=0
        timeout=5
        splashimage=(hd0,0)/grub/splash.xpm.gz
        hiddenmenu
        title Linux OpenVZ (vmlinuz-2.6.32-042stab057.1)
            root (hd0,0)
            kernel /boot/vmlinuz-2.6.32-042stab057.1 ro root=/dev/mapper/vg_stock-lv_root rd_LVM_LV=vg_stock/lv_root rd_LVM_LV=vg_stock/lv_swap rd_NO_LUKS rd_NO_MD rd_NO_DM LANG=en_US.UTF-8 SYSFONT=l$
            initrd /initramfs-2.6.32-042stab057.1.img
        title CentOS (2.6.32-71.el6.x86_64)
            root (hd0,0)
            kernel /boot/bzImage-3.2.13-xxxx-grs-ipv6-64 ro root=/dev/mapper/vg_stock-lv_root rd_LVM_LV=vg_stock/lv_root rd_LVM_LV=vg_stock/lv_swap rd_NO_LUKS rd_NO_MD rd_NO_DM LANG=en_US.UTF-8 SYSFO$
            initrd /initramfs-2.6.32-71.el6.x86_64.img
        # dummy text

    Somebody mentioned something about OVH having added a script which changes the kernel settings, and suggested that we either remove the script or reinstall using a VNC, but we don't know how to go about doing either of these. It would really be great if you guys could help. Thanks in advance.
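
    One hedged observation on the pasted files, offered as a possible clue rather than a confirmed diagnosis: the anaconda NOTICE in menu.lst says kernel and initrd paths are relative to /boot/, yet each stanza mixes a /boot/-prefixed kernel path with an un-prefixed initrd path, so at most one of the two can resolve. A self-consistent stanza, assuming /boot really is a separate partition as the NOTICE states, would look like:

        title vzkernel (2.6.32-042stab057.1)
            root (hd0,0)
            kernel /vmlinuz-2.6.32-042stab057.1 ro root=/dev/sda1
            initrd /initramfs-2.6.32-042stab057.1.img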

  • How can I move linked Word/Excel files without breaking the links under Windows 7?

    - by DOUG NEEDHAM
    I currently operate under Windows XP and have multiple links between my Word and Excel files. I have to upgrade to Windows 7. When the .doc and .xls files are converted to .docm and .xlsm, respectively, the links no longer work. The Word document is still attempting to point back to the old .xls file rather than the new file. Also, creating new links between Word and Excel within Office 2010 doesn't seem to work. I create the new link, switch it from "Auto" to "Manual" and everything works fine. But when I copy the files to another folder, the Word document is still trying to link to the file in the previous folder rather than the new folder. This always worked in Windows XP. I've been using linked Word/Excel documents for 10+ years and have never really had a problem. I'm very careful to maintain Word and Excel filenames when moving the files to a new folder. The process has always been to 1.) move the files, 2.) update the links, 3.) rename the files, and 4.) update the links again. It's my understanding that under Windows XP, links between Word and Excel are relative. But under Windows 7 (and Office 2010?), those same links become fixed.

  • CentOS 6.5 new Kernel not active after reboot

    - by Kristofer
    Today I was running some yum updates and wanted to verify that everything went through fine by making sure I had a new kernel. To my surprise I noticed that CentOS was still running 2.6.32-431.5.1.el6.x86_64 even though it looked as though 2.6.32-431.23.3.el6 was installed. Indeed 2.6.32-431.23.3.el6 shows up in /etc/grub.conf but not in the upstart boot options. Any ideas why? In the update log it says:

        ---> Package kernel-firmware.noarch 0:2.6.32-431.5.1.el6 will be updated
        ---> Package kernel-firmware.noarch 0:2.6.32-431.23.3.el6 will be an update

    Could this be the reason? What does "will be an update" mean? My /etc/grub.conf:

        # grub.conf generated by anaconda
        #
        # Note that you do not have to rerun grub after making changes to this file
        # NOTICE: You have a /boot partition. This means that
        #     all kernel and initrd paths are relative to /boot/, eg.
        #     root (hd0,0)
        #     kernel /vmlinuz-version ro root=/dev/mapper/VolGroup00-root
        #     initrd /initrd-[generic-]version.img
        #boot=/dev/vda
        default=0
        timeout=5
        splashimage=(hd0,0)/grub/splash.xpm.gz
        hiddenmenu
        password --encrypted $1$auui(i$sODM4ni/Zts9IlMWu.wWF/
        title CentOS (2.6.32-431.23.3.el6.x86_64)
            root (hd0,0)
            kernel /vmlinuz-2.6.32-431.23.3.el6.x86_64 ro root=/dev/mapper/VolGroup00-root rd_NO_LUKS LANG=en_US.UTF-8 KEYBOARDTYPE=pc KEYTABLE=sv-latin1 rd_NO_MD rd_LVM_LV=VolGroup00/swap SYSFONT=latarcyrheb-sun16 crashkernel=auto rd_LVM_LV=VolGroup00/root rd_NO_DM rhgb quiet rhgb quiet audit=1
            initrd /initramfs-2.6.32-431.23.3.el6.x86_64.img
        title CentOS (2.6.32-431.5.1.el6.x86_64)
            root (hd0,0)
            kernel /vmlinuz-2.6.32-431.5.1.el6.x86_64 ro root=/dev/mapper/VolGroup00-root rd_NO_LUKS LANG=en_US.UTF-8 KEYBOARDTYPE=pc KEYTABLE=sv-latin1 rd_NO_MD rd_LVM_LV=VolGroup00/swap SYSFONT=latarcyrheb-sun16 crashkernel=auto rd_LVM_LV=VolGroup00/root rd_NO_DM rhgb quiet rhgb quiet audit=1
            initrd /initramfs-2.6.32-431.5.1.el6.x86_64.img
        title CentOS (2.6.32-431.el6.x86_64)
            root (hd0,0)
            kernel /vmlinuz-2.6.32-431.el6.x86_64 ro root=/dev/mapper/VolGroup00-root rd_NO_LUKS LANG=en_US.UTF-8 KEYBOARDTYPE=pc KEYTABLE=sv-latin1 rd_NO_MD rd_LVM_LV=VolGroup00/swap SYSFONT=latarcyrheb-sun16 crashkernel=auto rd_LVM_LV=VolGroup00/root rd_NO_DM rhgb quiet rhgb quiet audit=1
            initrd /initramfs-2.6.32-431.el6.x86_64.img
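
    Two hedged sanity checks that are standard on CentOS 6 (run as root) to compare the running kernel with the one grub will boot by default:

        uname -r                 # kernel the machine is running right now
        grubby --default-kernel  # kernel that default=0 in grub.conf points at

    If grubby reports the new kernel but uname still shows the old one after a reboot, the machine is likely booting from somewhere other than this grub.conf.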

  • How to detect/list rogue computers connected to a WIFI network without access to the Wifi Router interface? [migrated]

    - by JJarava
    This is what I believe to be an interesting challenge :) A relative (who lives a bit too far away to go there in person) is complaining that her WIFI/Internet network performance has gone down abysmally lately. She'd like to know if some of the neighbors are using her wifi network to access the internet, but she's not too technically savvy. I know that the best way to prevent issues would be to change the router password, but it's a bit of a PITA having to re-configure all wifi devices... and if the uninvited guest broke the password once, they can do it again... Her wifi router/internet connection is provided by the telco and remotely managed, so she can log on to her telco account's page and remotely change the router's wifi password, but she doesn't have access to the router status page/config/etc. unless she opts out of the telco's remote support and maintenance service... So, how could she check if there are guests on the wifi, with these restrictions and in the most "point and click" way? In this case I'd probably use nmap to look for other devices in the network, but I'm not sure if that's the easiest way to do it. I'm not a wifi expert, so I don't know if there are any wifi-scanning utils that can tell us who's talking to the router... Lastly, she's a Windows user, and I guess that'll influence the choice of tools available. Any suggestions more than welcome. Regards!
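
    Since nmap is already on the table: a hedged sketch of the usual ping scan, which also works from the Windows build of nmap (the 192.168.1.0/24 subnet is an assumption; check the PC's own address with ipconfig first):

        # list every device currently answering on the local subnet
        nmap -sn 192.168.1.0/24

    Comparing the list against her known devices would reveal any extra guests; Zenmap, the point-and-click GUI bundled with the Windows installer, runs the same scan.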

  • Scenario - NTFS Symbolic Link or Junction?

    - by Unsigned
    Differences:

                           Absolute   Relative   File   Directory   UNC
        Symbolic link         ✓          ✓        ✓         ✓        ✓
        Junction              ✓          x        x         ✓        x

    Scenario: Let's assume we're creating a reparse point to create the redirect C:\SomeDir => D:\SomeDir. Since this scenario only requires local, absolute paths, either a junction or a symlink would work. In this situation, is there any advantage to using one or the other? Assume Windows 7 for the OS, disregarding backward-compatibility (prior to Vista, symlinks are not supported).

    Update: I have found another difference. A symbolic link's permissions only affect delete/rename operations on the link itself; read/write access (to the target) is governed by the target's permissions. A junction's permissions affect enumeration: revoking permissions on the junction will deny file listing through that junction, even if the target folder has more permissive ACLs. The permissions make it interesting, as symlinks can allow legacy applications to access configuration files in UAC-restricted areas (such as %ProgramFiles%) without changing existing access permissions, by storing the files in a non-restricted location and creating symlinks in the restricted directory.
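
    For concreteness, the two candidates for the scenario above, as a hedged sketch from an elevated cmd.exe (you would create one or the other, not both; on Windows 7, mklink /D requires administrative rights by default while /J does not):

        :: symbolic-link version of the C:\SomeDir => D:\SomeDir redirect
        mklink /D C:\SomeDir D:\SomeDir

        :: junction version of the same redirect
        mklink /J C:\SomeDir D:\SomeDir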

  • Why am I experiencing random connection timeouts? (CentOS)

    - by Ryan
    I have a CentOS server setup that currently hosts several websites (all related to each other in some form or another). As of recently, at the most random times throughout the day, website speed will slow to a crawl and eventually hit a connection timeout. When I say random times, this typically happens anywhere between 10am and 1pm; however, this morning it happened at 8am. I am not very familiar with server administration, so I don't know what to look for in this situation. What are some possible causes of my server slowing the websites down to a complete crawl or timing out? Are there specific things I should be checking for when this happens? I have noticed, using:

        tail /var/log/httpd/access_log

    that when this downtime occurs there are a lot of IP addresses related to BingBot, Googlebot, and sometimes various bots or spiders that I am unfamiliar with. Could this be related, and if so, how can I avoid it causing my websites to lag out? Thanks in advance for any help or advice. The websites that are timing out are built with PHP and use a MySQL database to display information.
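
    To put a number on the bot theory, a hedged one-liner that tallies requests per user agent from the same log the question already tails (this assumes Apache's combined log format; adjust the field number if the LogFormat differs):

        # count hits per user agent, busiest first
        awk -F'"' '{print $6}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head

    If crawlers dominate, a Crawl-delay directive in robots.txt is the usual first mitigation for the well-behaved ones.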

  • Formatting pwd/ls for use with scp

    - by eumiro
    I have two terminal windows with bash. One is local on the client computer; the other has an SSH session on the server. On the server, I am in a directory, looking at a file I would like to copy to my client using scp from the client. On the server I see:

        user@server:/path$ ls filename
        filename

    I can now type scp in the client shell, select and copy the user@server:/path from the server shell and paste it into the client shell, then type a slash, copy and paste the filename, and append a dot to get:

        user@client:~$ scp user@server:/path/filename .

    to scp a file from the server to the client. Now I am searching for a command on the server that would work like this:

        user@server:/path$ special_ls filename
        user@server:/path/filename

    which would give me the complete scp-ready string to copy & paste into the client shell. Something in the form echo $USER@$HOSTNAME:${pwd}/$filename, working with relative/absolute paths. Is there any such command/switch combination, or do I have to hack it myself? Thank you very much.
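
    If it does have to be hacked together, a minimal bash sketch of the asked-for helper (special_ls is the question's own name; note that $HOSTNAME may expand to a short name rather than the FQDN used for ssh, in which case it needs hard-coding):

        # print an scp-ready path for each argument; relative names are
        # resolved to absolute ones with readlink -f (GNU coreutils)
        special_ls() {
            local f
            for f in "$@"; do
                echo "$USER@$HOSTNAME:$(readlink -f -- "$f")"
            done
        }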

  • Intel Motherboard Lightning Victim Dies Hard

    - by Stetson RDT
    Today, I have a more hardware-related question. I have an Intel board, and I really do not know which board it is; I built the machine for a relative, but he forgot to keep the documentation. Long story short, the computer was disconnected during a lightning storm, but a lightning strike travelled in via the ethernet cable (it was directly connected to a power brick of the kind commonly seen on long-distance ISP wireless transmitters), and the motherboard was shocked. I am attempting to get this PC going. The problem is as follows: the computer will randomly reboot, right in the middle of anything, as it pleases. It may load to EFI (or whatever BIOS is nowadays), may load to the bootloader, may even get to the OS. But before 5 minutes are up, the system will always die. Out of curiosity, I plugged my voltmeter into a Molex connector. On the 5V side, it gets a good, consistent +5.13V. On the 12V side, it fluctuates, as follows: upon startup, it soars to 12.11-12.13V. It will then do one of two things: immediately jump down to 12.04-12.05V, or hover for about a minute at 12.11-12.13V and then jump down. It seems the longer the voltage stays at 12.11-12.13V, the shorter the machine will stay running. Also, POST codes, whenever the machine locks up but does not die hard, seem to be between "AA" and "AC". Does this make any sense to anybody? Do you all think this motherboard is salvageable? It was an expensive bugger, and I'd prefer not to replace it.

  • nVidia performance with newer X and newer driver abysmal with Compiz

    - by Nakedible
    I recently upgraded Debian to Xorg 2.9.4 and installed nvidia-glx from experimental, version 260.19.21. This was somewhat of an uphill battle as the dependencies for the experimental nvidia-glx package are still somewhat broken. I got it to work without forcing the installation of any packages and without modifying the packages. However, after the upgrade compiz performance has been abysmal. I am using the desktop wall plugin and switching viewports is really slow - takes a few seconds for each switch. In addition to this, every effect that compiz does, such as zoom animations for icons when launching applications, takes seconds. The viewport switching speed changes relative to the amount of windows on that virtual screen - empty screens switch almost at normal speed, single browser windows work almost decently, but just 4 rxvt terminals slows the switches down to a crawl. My compiz configuration should be pretty basic. Xorg is likewise configured without anything special - the only "custom" configuration is forcing the driver name to be "nvidia". I've fiddled around with the nvidia-settings and compizconfig trying different VSync settings, but none of those helped. My graphics card is: NVIDIA GPU NVS 3100M (GT218) at PCI:1:0:0 (GPU-0). This is laptop GPU that is from the Geforce GTX 200 series. Graphics card performance should naturally be no problem. EDIT: In the end, nothing really worked, and I got really annoyed with the state of compiz and its support in Debian. Many nVidia driver revisions have passed and I am using Gnome 3 now, so I am accepting the best answers to this question even though the issue was not resolved.

  • Is my motherboard failing, or is there some other issue?

    - by ThatGuy
    So, several months ago I put together my own desktop PC and set up a dual boot of Windows and Ubuntu. Recently, without changing any settings or installing anything new, the wifi stopped working on Windows (I use a wifi adapter). It said it was connected, network settings showed that it was working, and running troubleshooting had no results. My internet still works on any other device. I found that removing the adapter from the motherboard and plugging it back in was the only thing that fixed the problem. Reinstalling the wifi drivers did not help. I purchased a new wifi adapter, but the problem persists. More recently, I had a much more discouraging development. Sometimes, turning on the computer results in a boot loop: BIOS never starts. Instead, the monitor turns on as if it got a signal, then immediately turns off. This loops on its own indefinitely until I hold down power, hard reset it, and try again. Sometimes it works, sometimes it doesn't. I haven't tested much on the Ubuntu side. It appears that wifi works at least some of the time, but since I've had issues just getting to BIOS, I'm not confident the issue is on the software side. I've also noticed issues with some of the USB ports no longer working, but that seems to come and go. Finally, as of a few minutes ago, I booted to Windows to discover that everything was running very slowly. Slow here is a relative word, but I have a Samsung 840 Pro SSD and I'm used to applications launching nigh instantly, and it was a solid 3 minutes before any of my applications would load. Anyway, my question is this: is it likely that my motherboard is failing? Either way, what steps can I take to try and pin down the problem and figure out what to do?

  • Manual Http error response code in non-existent folder via routing

    - by Slytherin
    Apache server running on Ubuntu-like Linux. I am getting unexpected behaviour when I try to manually send an error response. If my .htaccess is responsible for the error response, then the appropriate error document is loaded and displayed, with the corresponding response code in the browser console. However, if my router is the origin of the response code, then I get a blank screen, but the correct response code. My .htaccess looks like this:

        RewriteEngine On
        # RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule !\.(css|js|icon|zip|rar|png|jpg|gif|pdf)$ index.php [L]

        ErrorDocument 404 /err/404.html
        ErrorDocument 403 /err/403.html
        ErrorDocument 500 /err/500.html

    The part of my router that sends the response is the following:

        header("HTTP/1.1 403 Forbidden");

    Trying this format didn't help either:

        header("HTTP/1.1 403 Forbidden", TRUE, 403);

    I also tried HTTP/1.0. Furthermore, I was thinking that maybe the relative path to the error page might be an issue, but I discarded this idea after attempting to access a document that is forbidden via .htaccess.

    EDIT: I should also point out that this scenario happens when a URL for a non-existing article is requested. Is it possible that the server is looking for a .htaccess file in a folder based on the URL? E.g., for domain/blog/non-existent, is the server looking for a blog folder? I am specifically asking this because there is no blog folder.
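
    A hedged explanation that fits the symptom: Apache's ErrorDocument directives apply when Apache itself generates the error, not when a script sets the status line, so a header()-only 403 from the router ships an empty body. A sketch of the router serving the same error page itself (the /err/403.html path is taken from the .htaccess above; http_response_code needs PHP 5.4+, otherwise keep the header() call):

        <?php
        // send the status, then emit the body Apache would not add for us
        http_response_code(403);
        readfile($_SERVER['DOCUMENT_ROOT'] . '/err/403.html');
        exit;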

  • Apache Virtualhost entry with Windows hostname

    - by gshauger
    I have a Windows Domain Controller and we use it for DNS on our internal network. I have an Ubuntu box with an IP address of 172.16.34.149. Within the Windows DNS I created the forward and reverse lookup entries for the name Endymion. Naturally, whenever I FTP/SSH/HTTP/etc. to the hostname Endymion, it resolves correctly to my Ubuntu box. I wanted to do some web development on this box for an existing site. There were problems when I placed the website in a subfolder of /var/www/. Let's just say it was in the folder /var/www/projectx/. The issue involved the incorrect resolution of non-relative URLs. So I figured I could create a new DNS entry for the hostname projectx. Sure enough, when I FTP/SSH/HTTP/etc. to the hostname projectx, it takes me to the same Ubuntu box as the hostname Endymion... this is what I would expect. I now have two hostnames for the same box. I then create a VirtualHost entry in httpd.conf that looks like the following:

        <VirtualHost *:80>
            DocumentRoot /var/www/projectx
            ServerName projectx
            ServerAlias projectx
        </VirtualHost>

    Sure enough, when I go to a browser and type in http://projectx/ it takes me to the correct subfolder. Everything works!!! Not so fast. I then go to http://endymion/ and instead of taking me to /var/www/ it takes me to /var/www/projectx/. Clearly I'm missing something. Help please! ;)
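
    A hedged explanation and sketch: with name-based virtual hosting, any request whose Host header matches no ServerName is served by the first VirtualHost in the configuration, and with only the projectx block defined, Endymion falls through to it. Giving endymion its own block, listed first so it stays the default, should separate the two (on Apache 2.2 the NameVirtualHost line is also required; 2.4 ignores it):

        NameVirtualHost *:80

        <VirtualHost *:80>
            DocumentRoot /var/www
            ServerName endymion
        </VirtualHost>

        <VirtualHost *:80>
            DocumentRoot /var/www/projectx
            ServerName projectx
        </VirtualHost>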

  • Which components should I invest in for a backup machine?

    - by Senthil
    I am a freelance developer. I have a PC, a laptop, and an old testing and file server machine; I might add one or two more in future. I want to have an on-site backup machine that can handle backups of ALL these machines: file backups, MySQL backups, backups of a subversion repository, etc. When building the machine, which components should I invest more in? Examples: The cabinet should have lots of room for expansion. Hard disk size should be large. But I guess hard disk speed need not be high(?). For other components, like RAM, PSU, processor, network card, cooling, etc., how much relative importance do these have in a backup machine? Which of these components should be high-end or large, and which ones need not be? Some idea of the load: there will be TBs of data. File backups and subversion repository backups will be done at least daily; MySQL backups weekly. Assume 3 machines at the moment and somewhere around 10 machines in the future.

  • Efficient mirroring of directories using hardlinks

    - by zoqaeski
    I'm backing up my music collection on to a number of NTFS-formatted external hard-drives; however, as I store my main collection in FLAC and have my library on my laptop as MP3s to save space, I want to be able to back up both sets, because mass conversion between formats is time-consuming. The "music" directory can contain any format; the "mp3s" directory contains only MP3s converted from files in the "music" directory. The music collection on the laptop contains only MP3s, but they come from both sources. When I back up my laptop's library to the "mp3s" directory, I want to only copy across MP3 files that don't exist in the "music" directory; those that do should be hard-linked to the "music" directory. All directories have an identical hierarchy, sorted by artist, album, date, discnumber if applicable, etc., and I use a tagging editor to ensure consistency across all these locations. I'm also using a Linux computer, but keeping the music collections on NTFS-formatted partitions so that they are readable by both Linux and Windows. At the moment, I use the following command to perform the backups, but this is time-consuming due to the expensive nature of finding hard links:

        rsync -avu --progress --relative --ignore-existing --link-dest=../music/ **/*.mp3 /media/ntfspocket/mp3s

    Is there a way to perform this backup more efficiently, taking advantage of the directory hierarchy?

  • Shortcuts that make using Windows easier? [closed]

    - by ekaj
    Over the years I have found a good number of shortcuts, all of which make using Windows that much faster and easier. I was wondering if I had missed any important ones, or any that just save time. I tried to keep these relative to Windows only, as programs such as Mozilla, Photoshop, etc. bring in hundreds of other shortcuts, which do not apply to all users. These are the ones I currently know.

    For the mouse:

        Scroll wheel - opens a link in IE in a new tab; closes a specific tab in IE when clicked on it
        Right click-n-drag - nifty menu to make a copy of a file or create a shortcut

    For the keyboard:

        Ctrl + Alt + Del - need this be explained?
        WinKey + R - opens 'Run...'
        WinKey + D - shows the desktop
        WinKey + E - opens 'My Computer'
        WinKey + F - opens 'Find...'
        WinKey + Tab - cool way to switch between windows
        WinKey + L - locks the computer
        Alt + Tab - switches windows with a simple interface
        Alt + F4 - closes windows
        Alt + F - opens 'File' (handy for things like IE if you have the menu bar disabled)
        Ctrl + F - finds text in the current document
        Ctrl + W - closes a window, or the current active tab in IE (or IE itself if only one tab)
        Ctrl + C - copies selected text / image
        Ctrl + V - pastes selected text / image

    So does anyone know of any more shortcuts that just make life easier? I would be much appreciative of any, and I am sure that some other users would like to know some too =]

  • SSH connect error

    - by DMDGeeker
    I use a notebook with Ubuntu 12.10 and try to connect to a server with Ubuntu 12.04. The server already has openssh-server installed and allows both public key and password login. I can connect to the server fine at first, but after some minutes it starts to fail. First, it shows me these messages:

        WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!
        ......
        Add correct host key in /home/myname/.ssh/known_hosts to get rid of this message.

    But I never reinstalled the system or openssh-server; neither has changed, and the server was never shut down or rebooted. Second, after I remove the relevant key from my known_hosts and use ssh to connect to the server again, it lets me type my password. Then my nightmare begins:

        Permission denied (publickey,password)

    But I typed the correct password! PS: I have logged in successfully with both password and public key before, but the problem appears again after I log out and log back in.
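
    Two hedged first steps for this pair of symptoms (the host below is a placeholder for the real server name):

        # drop the stale known_hosts entry instead of editing the file by hand
        ssh-keygen -R server.example.com

        # verbose output shows which host key and auth methods are actually tried
        ssh -v user@server.example.com

    If the host key really does alternate between connections, it is worth checking whether two machines answer on the same address (a DHCP lease collision or an intervening NAT device), since that would also explain the password being rejected.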

  • yui compressor maven plugin doesn't compress the js files

    - by hanumant
    I am using the YUI compressor to compress the JS files in my web app. I have configured the plugin as indicated on the yui compressor maven plugin site. This is the plugin configuration in my pom:

        <plugin>
          <groupId>net.sf.alchim</groupId>
          <artifactId>yuicompressor-maven-plugin</artifactId>
          <version>0.7.1</version>
          <executions>
            <execution>
              <phase>compile</phase>
              <goals>
                <goal>jslint</goal>
                <goal>compress</goal>
              </goals>
            </execution>
          </executions>
          <configuration>
            <failOnWarning>true</failOnWarning>
            <nosuffix>true</nosuffix>
            <force>true</force>
            <aggregations>
              <aggregation>
                <!-- remove files after aggregation (default: false) -->
                <removeIncluded>false</removeIncluded>
                <!-- insert new line after each concatenation (default: false) -->
                <insertNewLine>false</insertNewLine>
                <output>${project.basedir}/${webcontent.dir}/js/compressedAll.js</output>
                <!-- files to include, path relative to output's directory or absolute path -->
                <!--inputDir>base directory for non absolute includes, default to parent dir of output</inputDir-->
                <includes>
                  <include>**/autocomplete.js</include>
                  <include>**/calendar.js</include>
                  <include>**/dialogs.js</include>
                  <include>**/download.js</include>
                  <include>**/folding.js</include>
                  <include>**/jquery-1.4.2.min.js</include>
                  <include>**/jquery.bgiframe.min.js</include>
                  <include>**/jquery.loadmask.js</include>
                  <include>**/jquery.printelement-1.1.js</include>
                  <include>**/jquery.tablesorter.mod.js</include>
                  <include>**/jquery.tablesorter.pager.js</include>
                  <include>**/jquery.dialogs.plugin.js</include>
                  <include>**/jquery.ui.autocomplete.js</include>
                  <include>**/jquery.validate.js</include>
                  <include>**/jquery-ui-1.8.custom.min.js</include>
                  <include>**/languageDropdown.js</include>
                  <include>**/messages.js</include>
                  <include>**/print.js</include>
                  <include>**/tables.js</include>
                  <include>**/tabs.js</include>
                  <include>**/uwTooltip.js</include>
                </includes>
                <!-- files to exclude, path relative to output's directory -->
              </aggregation>
            </aggregations>
          </configuration>
          <dependencies>
            <dependency>
              <groupId>rhino</groupId>
              <artifactId>js</artifactId>
              <scope>compile</scope>
              <version>1.6R5</version>
            </dependency>
            <dependency>
              <groupId>org.apache.maven</groupId>
              <artifactId>maven-plugin-api</artifactId>
              <version>2.0.7</version>
              <scope>provided</scope>
            </dependency>
            <dependency>
              <groupId>org.apache.maven</groupId>
              <artifactId>maven-project</artifactId>
              <version>2.0.7</version>
              <scope>provided</scope>
            </dependency>
            <dependency>
              <groupId>net.sf.retrotranslator</groupId>
              <artifactId>retrotranslator-runtime</artifactId>
              <version>1.2.9</version>
              <scope>runtime</scope>
            </dependency>
          </dependencies>
        </plugin>

    And here is the log at compress time:

        These will use the artifact files already in the core ClassRealm instead, to allow them to be included in PluginDescriptor.getArtifacts().
        [DEBUG] Configuring mojo 'net.sf.alchim:yuicompressor-maven-plugin:0.7.1:jslint'
        [DEBUG] (f) failOnWarning = true
        [DEBUG] (f) jswarn = true
        [DEBUG] (f) outputDirectory = C:\test\target\classes
        [DEBUG] (f) project = MavenProject: com.test.test1:test2:19-SNAPSHOT @ C:\test\pom.xml
        [DEBUG] (f) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\test\src, PatternSet [includes: {}, excludes: {**/*.class, **/*.java, site/*}]}}]
        [DEBUG] (f) sourceDirectory = C:\test\src\..\js
        [DEBUG] (f) warSourceDirectory = C:\test\src\main\webapp
        [DEBUG] (f) webappDirectory = C:\test\target\test2-19-SNAPSHOT
        [DEBUG] -- end configuration --
        [INFO] [yuicompressor:jslint {execution: default}]
        [INFO] nb warnings: 0, nb errors: 0
        [DEBUG] Configuring mojo 'net.sf.alchim:yuicompressor-maven-plugin:0.7.1:compress' --
        [DEBUG] (f) removeIncluded = false
        [DEBUG] (f) insertNewLine = false
        [DEBUG] (f) output = C:\test\WebContent\js\compressedAll.js
        [DEBUG] (f) includes = [**/autocomplete.js, **/calendar.js, **/dialogs.js, **/download.js, **/folding.js, **/jquery-1.4.2.min.js, **/jquery.bgiframe.min.js, **/jquery.loadmask.js, **/jquery.printelement-1.1.js, **/jquery.tablesorter.mod.js, **/jquery.tablesorter.pager.js, **/jquery.dialogs.plugin.js, **/jquery.ui.autocomplete.js, **/jquery.validate.js, **/jquery-ui-1.8.custom.min.js, **/languageDropdown.js, **/messages.js, **/print.js, **/tables.js, **/tabs.js, **/uwTooltip.js]
        [DEBUG] (f) aggregations = [net.sf.alchim.mojo.yuicompressor.Aggregation@65646564]
        [DEBUG] (f) disableOptimizations = false
        [DEBUG] (f) encoding = Cp1252
        [DEBUG] (f) failOnWarning = true
        [DEBUG] (f) force = true
        [DEBUG] (f) gzip = false
        [DEBUG] (f) jswarn = true
        [DEBUG] (f) linebreakpos = 0
        [DEBUG] (f) nomunge = false
        [DEBUG] (f) nosuffix = true
        [DEBUG] (f) outputDirectory = C:\test\target\classes
        [DEBUG] (f) preserveAllSemiColons = false
        [DEBUG] (f) project = MavenProject: com.test.test1:test2:19-SNAPSHOT @ C:\test\pom.xml
        [DEBUG] (f) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\test\src, PatternSet [includes: {}, excludes: {**/*.class, **/*.java, site/*}]}}]
        [DEBUG] (f) sourceDirectory = C:\test\src\..\js
        [DEBUG] (f) statistics = true
        [DEBUG] (f) suffix = -min
        [DEBUG] (f) warSourceDirectory = C:\test\src\main\webapp
        [DEBUG] (f) webappDirectory = C:\test\target\test2-19-SNAPSHOT
        [DEBUG] -- end configuration --
        [INFO] [yuicompressor:compress {execution: default}]
        [INFO] generate aggregation : C:\test\WebContent\js\compressedAll.js
        [INFO] compressedAll.js (407505b)
        [INFO] nb warnings: 0, nb errors: 0
        [DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-resources-plugin:2.2:testResources' --
        [DEBUG] (f) filters = []
        [DEBUG] (f) outputDirectory = C:\test\target\test-classes
        [DEBUG] (f) project = MavenProject: com.test.test1:test2:19-SNAPSHOT @ C:\test\pom.xml
        [DEBUG] (f) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\test\test , PatternSet [includes: {}, excludes: {**/*.class, **/*.java}]}}]
        [DEBUG] -- end configuration --

    The problem is that the files are getting aggregated into one file, but without compressing. The link above uses version 1.1 and the plugin version I am using is 0.7.1; is this going to make any difference? Can someone tell me what is wrong here? PS: I have obfuscated some text in the log for compliance at my firm, so you may find it mismatching somewhere.
