Search Results

Search found 40294 results on 1612 pages for 'dot files'.


  • Ping Unknown Host on CentOS at EC2

    - by organicveggie
    Weird problem. We have a collection of servers running CentOS 5 on EC2. The setup includes two DNS servers and two LDAP servers. DNS has a CNAME pointing at the primary LDAP server. One machine (and only one machine) is giving me problems: I can ssh into the server using LDAP authentication, but once I'm on the machine, ping won't resolve the LDAP host even though DNS seems to work fine. Here's ping:

      $ ping ldap.mycompany.ec2
      ping: unknown host ldap.mycompany.ec2

    Here's the output of dig:

      $ dig ldap.mycompany.ec2
      ; <<>> DiG 9.3.6-P1-RedHat-9.3.6-4.P1.el5_5.3 <<>> ldap.mycompany.ec2
      ;; global options: printcmd
      ;; Got answer:
      ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 2893
      ;; flags: qr aa rd ra; QUERY: 1, ANSWER: 2, AUTHORITY: 0, ADDITIONAL: 0
      ;; QUESTION SECTION:
      ;ldap.mycompany.ec2. IN A
      ;; ANSWER SECTION:
      ldap.mycompany.ec2. 3600 IN CNAME ec2-hostname.compute-1.amazonaws.com.
      ec2-hostname.compute-1.amazonaws.com. 55 IN A aaa.bbb.ccc.ddd
      ;; Query time: 12 msec
      ;; SERVER: 10.32.159.xxx#53(10.32.159.xxx)
      ;; WHEN: Tue May 31 11:16:30 2011
      ;; MSG SIZE rcvd: 107

    And here is resolv.conf:

      $ cat /etc/resolv.conf
      search mycompany.ec2
      nameserver 10.32.159.xxx
      nameserver 10.244.19.yyy

    And here is my hosts file:

      $ cat /etc/hosts
      10.122.15.zzz bamboo4 bamboo4.mycompany.ec2
      127.0.0.1 localhost localhost.localdomain

    And here's nsswitch.conf:

      $ cat /etc/nsswitch.conf
      passwd: files ldap
      shadow: files ldap
      group: files ldap
      sudoers: ldap files
      hosts: files dns
      bootparams: nisplus [NOTFOUND=return] files
      ethers: files
      netmasks: files
      networks: files
      protocols: files
      rpc: files
      services: files
      netgroup: files ldap
      publickey: nisplus
      automount: files ldap
      aliases: files nisplus

    So DNS works the way I would expect, I can ping the LDAP server by IP address, and I can even access the box over SSH using LDAP authentication. Any suggestions?
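
    Since dig queries the DNS servers directly while ping resolves through the NSS "hosts" line, one avenue worth ruling out is the resolver path itself. A diagnostic sketch, assuming nscd is installed (nscd is not mentioned in the original question):

      $ getent hosts ldap.mycompany.ec2   # exercises the same lookup path ping uses
      $ sudo /sbin/service nscd status    # if nscd is running, a cached negative
      $ sudo /sbin/service nscd restart   #   entry can break ping while dig still works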

    Read the article

  • Backing up files on Ubuntu for reinstall. Will there be problems with permissions?

    - by adam
    I have some very important files I want to back up before I reinstall Ubuntu, going back to 9.04 from 9.10 (it's causing me all sorts of problems). The files' total size is small, so I'm just going to copy them over to Dropbox. I'm wondering: when I reinstall Ubuntu and copy them back, will there be any issues with the permissions of those files, because the old user account which created them and the new user I'll set up on the fresh install will be different?
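
    Worth noting (general Unix behaviour, not from the original question): ownership follows numeric UIDs, and Dropbox does not preserve Unix ownership anyway, so the files can simply be reclaimed after copying them back. A minimal sketch, assuming the new account is named adam:

      sudo chown -R adam:adam ~/restored-files   # take ownership of everything copied back
      chmod -R u+rw ~/restored-files             # ensure the new user can read and write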

    Read the article

  • What to do with ca.crt, name.crt, name.key, name.ovpn files?

    - by tipu
    I was given these four files to access the office's VPN server. I am on Ubuntu 12.04 and am unsure how to begin using them. I tried the VPN connection tab under Network Connections, but after importing, the files didn't specify a username; the dialog forced me to save one, and attempting to connect didn't yield any results. What am I supposed to do with these four files to connect to the VPN?
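
    Those four files are a standard OpenVPN client bundle: CA certificate, client certificate, client key, and a config that usually references the other three by relative path. A minimal command-line sketch, assuming the .ovpn file points at the other files correctly:

      sudo apt-get install openvpn
      cd /path/to/the/four/files
      sudo openvpn --config name.ovpn

    For the Network Manager route, installing the network-manager-openvpn-gnome package first and then importing the .ovpn file should carry the certificate paths across.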

    Read the article

  • How can I maximally compress .gz files in Nautilus?

    - by Takkat
    When selecting Compress... from the right-click context menu in Nautilus, I am able to quickly compress files to .gz format. However, by default Nautilus does not use maximum compression. Can I make Nautilus use maximum compression, like gzip -9? Using gconftool or gconf-editor to set the compression_level for File Roller to maximum seems right, but unfortunately it does not have the desired effect and does not lead to maximally compressed files. As this is the expected way to set compression levels, a bug report has been filed upstream. Any ideas for a workaround are welcome.
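
    One possible workaround is the Nautilus scripts mechanism: an executable shell script dropped into the scripts folder appears under the right-click Scripts menu, with the selection passed in via an environment variable. A sketch (the script name is arbitrary; the folder is ~/.gnome2/nautilus-scripts on releases of this era, ~/.local/share/nautilus/scripts on newer ones):

      #!/bin/sh
      # Save as ~/.gnome2/nautilus-scripts/gzip-max and chmod +x it.
      # Nautilus passes the selected files one per line:
      echo "$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS" | while read -r f; do
          [ -f "$f" ] && gzip -9 "$f"
      done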

    Read the article

  • Why is it good to have website content files on a separate drive other than system (OS) drive?

    - by Jeffrey
    I am wondering what benefits moving all website content files from the default inetpub directory on C: to something like D:\wwwroot will give me. By default IIS creates a separate application pool for each website, and I am using the built-in user and group (IIS_IUSRS) as the authentication method. I've made sure each site directory has the appropriate permission settings, so I am not sure what benefits I would gain. Some of the environment settings are as below:

      VMware
      Windows 2008 R2 64-bit
      IIS 7.5
      C:\inetpub\site1
      C:\inetpub\site2

    Also, as this article (moving the iis7 inetpub directory to a different drive) points out, I am not sure it's worth the trouble to migrate files to a different drive: "PLEASE BE AWARE OF THE FOLLOWING: WINDOWS SERVICING EVENTS (I.E. HOTFIXES AND SERVICE PACKS) WOULD STILL REPLACE FILES IN THE ORIGINAL DIRECTORIES. THE LIKELIHOOD THAT FILES IN THE INETPUB DIRECTORIES HAVE TO BE REPLACED BY SERVICING IS LOW BUT FOR THIS REASON DELETING THE ORIGINAL DIRECTORIES IS NOT POSSIBLE."
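
    For anyone who does decide to relocate a site, repointing an existing site's content is a one-liner per site with appcmd (a sketch, not from the original question; paths are illustrative):

      %windir%\system32\inetsrv\appcmd set vdir "site1/" /physicalPath:"D:\wwwroot\site1"

    The usual arguments for a separate data drive are operational rather than security-related: OS reinstalls and servicing leave D: untouched, and a runaway log or upload cannot fill the system volume.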

    Read the article

  • The DotNetNuke Gallery Module - 7 Video Tutorials

    In this video tutorial we cover the installation of the DotNetNuke Core Gallery Module and show you how to configure it correctly. We walk you through how to use all of the features in the Gallery module, including creating albums, uploading files, bulk uploads, the slideshow, media files, watermarks, templates, and more. The videos contain:

      Video 1 - Introduction to the DNN Gallery Module, Installation and Basic Configuration
      Video 2 - How to Upload Images and Configure Their Settings
      Video 3 - Creating Gallery Albums and Bulk Uploading
      Video 4 - How to Add Files and Albums Using FTP, Adding Music and Changing Permissible Upload Types
      Video 5 - How to Add Video Files, How to Rate Files, Gallery Look and Feel
      Video 6 - Changing Feature Settings, Adding Watermarks, Gallery Security Roles
      Video 7 - Working with Private Galleries and Security Roles, Gallery Maintenance

    Total Time Length: 57 mins

    Read the article

  • U1 music shows unknown artist, how can I make it recognize m4a files?

    - by bisi
    Hello all, I am in the process of uploading my music library as a backup to Ubuntu One, but I figured, why not also enjoy Ubuntu One music on my iPhone? After a few difficulties getting started, the upload is now in progress, but I've noticed that a huge percentage of files end up in the "unknown artist" folder, and I believe it is all of my m4a files. They play fine, but without any information. Coming from an iTunes background, and having bought the majority of my music on the iTunes Store, I wonder how I could make this work, easily? I am on Maverick (afaik), but About Ubuntu shows 11.04. I use Banshee as my music manager, and I monitor my sync using Ubuntu One preferences, ubuntuone-indicator and magicicada. The total file size of my music folder is 38.9 GB. Thank you for your help! And apologies if I couldn't find a thread where this was already covered.
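
    A first diagnostic step (a sketch; mutagen is not mentioned in the original question) is to confirm whether the m4a files actually carry artist/album tags, since tracks tagged only in iTunes' own database would show up untagged everywhere else:

      sudo apt-get install python-mutagen
      mutagen-inspect "some track.m4a"   # prints the embedded tags, if any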

    Read the article

  • Lubuntu 14.04: problem starting lxsession-default-apps

    - by user278179
    I have one problem: I can't run lxsession-default-apps on Lubuntu 14.04, because it tells me "The database is updating, please wait". If I try to run lxsession-default-apps from a terminal, I get this output:

      ** Message: utils.vala:30: config_path_directory: /home/USER/.config/lxsession-default-apps
      ** Message: desktop-files-backend.vala:171: test config_path: /home/USER/.config/lxsession-default-apps/settings.conf
      ** Message: desktop-files-backend.vala:237: Scanning folder: /usr/share/applications
      ** Message: desktop-files-backend.vala:278: Start scanning
      ** Message: desktop-files-backend.vala:257: Scanning folder: /usr/share/app-install/desktop
      ** Message: desktop-files-backend.vala:278: Start scanning
      Error: list_files failed: No such file or directory
      ** Message: desktop-files-backend.vala:333: Finishing scanning
      ** Message: desktop-files-backend.vala:189: Signal finish scanning with mode: write
      ** Message: desktop-files-backend.vala:333: Finishing scanning

    Any help would be appreciated. Thanks. Regards.
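
    The trace suggests the scan dies on /usr/share/app-install/desktop, which does not exist on a minimal install. A possible workaround (an assumption based on the error above, not a confirmed fix) is to provide that folder, either via the package that normally ships it or by hand:

      sudo apt-get install app-install-data   # provides /usr/share/app-install
      # or simply:
      sudo mkdir -p /usr/share/app-install/desktop
      lxsession-default-apps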

    Read the article

  • How can I drag and drop files into the Ubuntu One bookmark in Nautilus?

    - by coversnail
    I have a bookmark to my Ubuntu One folder in the sidebar of Nautilus. It would seem logical that I could drag and drop files and folders onto this bookmark and have the dropped items copied to the Ubuntu One folder, to be synced to the cloud. However, this does not happen: I can drop files and folders onto the shortcuts underneath "Computer" in the Nautilus sidebar, but not onto anything that is a bookmark. Is it possible to change this behaviour? Or, failing that, are there any workarounds that get a similar result?
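
    One possible workaround (untested against this Nautilus version, so treat it as a sketch): symlink the synced folder onto the Desktop, since folder icons there do accept drops, and anything dropped in lands inside the real synced folder:

      ln -s "$HOME/Ubuntu One" "$HOME/Desktop/Ubuntu One"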

    Read the article

  • Why does Windows Media Center try to open zip files?

    - by gpryatel
    Notes: the OS is Windows 7 and the browser is the latest Firefox. After saving a zip file to the desktop, Windows Media Center opens up. I looked around its config settings but could not find anything related to zip files. How do I turn that off? Also, I don't know if this should be a separate question or not: unless I right-click and Save Link As... for zip files, I don't get a Firefox dialog asking what to do with the file (Open/Save); the files get saved to some place like c:\users\namegoeshere\appdata. This only happens on the Win7 computer. I looked around in Firefox's settings for saving files, and I do have "Always ask me where to save files" enabled. I can get more exact path names when I get home.
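
    The Media Center behaviour points at a hijacked .zip file association. A sketch for inspecting and restoring it from an elevated Command Prompt (CompressedFolder is the stock Windows 7 value; verify against a healthy machine first):

      assoc .zip
      assoc .zip=CompressedFolder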

    Read the article

  • Rar.exe CLI - add files from directory without directory structure?

    - by Vercas
    I need to grab every file in a folder and shove it into a RAR archive. This is my current method:

      "C:\Program Files\WinRAR\Rar.exe" a -r -md2m -s -m5 -ma4 -t ..\Releases\vCommands.rar bin\

    ...where bin is my folder. I tried the equivalent with another program, and the results are the same. To be clear: in the resulting .rar file there is a bin directory which contains all the files, whereas in a .7z file built from inside the folder, all those files sit in the archive root. What I need is to put all those files in the .rar archive root instead of a folder, without having to execute my batch file inside that bin folder.
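
    Rar.exe has switches for exactly this: -ep strips all paths from the stored names, and -ep1 strips only the base folder given on the command line. A sketch of the adjusted command, keeping the other switches as above:

      "C:\Program Files\WinRAR\Rar.exe" a -r -ep1 -md2m -s -m5 -ma4 -t ..\Releases\vCommands.rar bin\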

    Read the article

  • Can I use a wildcard to denote subdirectories, as opposed to just files, in the Windows Command Prompt?

    - by Dinosaurus
    I know I can use a wildcard to list the files in a single directory:

      dir *.java

    However, does anyone know if it is possible to denote a subdirectory with a wildcard as well? I would like to do something like:

      dir classes/*/*.java

    ...where it would list all the java files in every subdirectory beneath the classes directory. So, if there is:

      classes/cs1100/
      classes/cs1200/
      classes/cs1500/

    ...it would list all the java files within these. Note, I'm not using this specifically for the dir command, but for another command-line tool that accepts a list of files; if it works for dir, it should work in my other program as well.
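
    Two standard cmd idioms cover this (a sketch; in a batch file the loop variable would be %%f rather than %f):

      dir /s /b classes\*.java
      for /r classes %f in (*.java) do @echo %f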

    Read the article

  • Why are 'libgnomevfs' files under /usr/include/gnome-vfs-2.0?

    - by George Edison
    Most applications, including the gnomevfs headers themselves, expect the files to be under /usr/include/libgnomevfs, but Ubuntu has them under /usr/include/gnome-vfs-2.0/libgnomevfs. Why? The package I'm referring to is called libgnomevfs2. Inside /usr/include/gnome-vfs-2.0/libgnomevfs/gnome-vfs.h we find:

      #include <libgnomevfs/gnome-vfs-acl.h>
      #include <libgnomevfs/gnome-vfs-address.h>
      #include <libgnomevfs/gnome-vfs-async-ops.h>
      #include <libgnomevfs/gnome-vfs-cancellation.h>
      ...

    This means that even the headers themselves expect the files to be in that location, and nothing that includes this file will compile. Am I missing something, or is this a glitch?
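
    The versioned directory is resolved through pkg-config rather than by the compiler's default search path: -I/usr/include/gnome-vfs-2.0 gets added to the compile flags, after which the <libgnomevfs/...> includes resolve normally. A sketch:

      pkg-config --cflags gnome-vfs-2.0
      # prints something like: -I/usr/include/gnome-vfs-2.0 -I/usr/lib/gnome-vfs-2.0/include ...
      gcc myapp.c $(pkg-config --cflags --libs gnome-vfs-2.0)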

    Read the article

  • How can I protect files on my NGiNX server?

    - by Jean-Nicolas Boulay Desjardins
    I am trying to protect files of multiple types on my server with NGiNX and PHP. Basically, I want people to have to sign in to the website if they want to access static files like images. Dropbox does this very well: they force you to sign in to access any static files you put on their server. I thought about using the NGiNX Perl module and writing a Perl script that would check the session to see if the user was signed in before giving them access to a static file. But I would prefer using PHP, because all my code runs under PHP and I am not sure how to check a session created by PHP from Perl. So basically my question is: how can I protect static files of any type so that access requires the user to be signed in, with a valid session created by a PHP script?
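
    The usual NGiNX-native pattern for this is X-Accel-Redirect: the files live in an internal-only location, PHP validates the session, and then hands the actual file transfer back to NGiNX. A minimal sketch (paths and the PHP filename are illustrative):

      location /protected/ {
          internal;                 # not reachable directly from outside
          alias /var/www/private/;  # where the real files live
      }

      # download.php, after the session check passes:
      #   header('X-Accel-Redirect: /protected/' . $file);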

    Read the article

  • How can I send super large files directly to another computer on the Internet for free?

    - by Cruise
    I regularly need to transfer very large files (30 GB) to my friend: financial statistics. I don't have any problem with bandwidth; it is very broad here. I did some research in the area, so:

      1. I would not use FTP, as it is very tricky to get working behind a NAT.
      2. I would not use Skype/MSN/ICQ, as they are not designed for file transfer and underperform on huge files.
      3. I would not use file-sharing services, as I would need to pay for big files (30 GB is a problem here) and I don't like holding any piece of my data on a third-party server.

    So I need some smart tool that will do what I need: sending files directly browser-to-browser and not browser-server-browser. Is that so complex? Is there some web application on the Internet that can do this?
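
    Outside the browser, the most direct peer-to-peer transfer is a raw socket between the two machines. A sketch with netcat (assuming the receiver's port 9000 is reachable, i.e. forwarded through any NAT; some nc builds take -l 9000 without -p):

      # on the receiving machine:
      nc -l -p 9000 > statistics.tar.gz
      # on the sending machine:
      nc receiver.example.com 9000 < statistics.tar.gz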

    Read the article

  • Why are some recovery tools still able to find deleted files after I purge the Recycle Bin, defrag the disk and zero-fill free space?

    - by Ivan
    As far as I understand, when I delete a file (without using the Recycle Bin), its record is removed from the file system's table of contents (FAT/MFT/etc.), but the values of the disk sectors which were occupied by the file remain intact until those sectors are reused to write something else. When I use some sort of erased-file recovery tool, it reads those sectors directly and tries to build up the original file. In that case, what I can't understand is why recovery tools are still able to find deleted files (with a reduced chance of rebuilding them, though) after I defragment the drive and overwrite all the free space with zeros. Can you explain this? I thought zero-overwritten deleted files could only be found by means of special forensic-lab magnetic scanning hardware, and that the complex wiping algorithms (overwriting free space multiple times with random and non-random patterns) only make sense for preventing such a physical scan from succeeding; but in practice it seems that a plain zero-fill is not enough to wipe all traces of deleted files. How can this be?

    Read the article

  • On Linux, how can I make a list of files that are owned by a particular owner and then fix the group and owner?

    - by Stuart Woodward
    I have a deep and complex file system where some files have been accidentally written by root. I want to change the ownership of those files back to the original owner in one go. I am playing with commands like:

      find /folder -type f | xargs ls -l | grep "root root"

    ...but there is a lot of garbage in the output too. I want to make a list first, and then change only the files in that list after confirmation.
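
    find can match on ownership directly, which avoids the ls/grep noise. A sketch of the list-then-fix flow (the user and group names are placeholders):

      find /folder -type f -user root -group root > /tmp/root-owned.txt
      less /tmp/root-owned.txt                              # review before touching anything
      xargs -d '\n' sudo chown origuser:origgroup < /tmp/root-owned.txt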

    Read the article

  • How to read data from a large number of files in a folder? [closed]

    - by Gary Dhillon
    I seem to be having some trouble figuring out a solution for a problem. The thing is, my code is supposed to read a lot of data from a bunch of files. I've been thinking of two different approaches:

      1. The first one seems simpler: ask the user whether they would like to examine the next file or just quit the program. (I believe this is simpler and would take less time to run through.)
      2. Read through all the files and output the results for each of them, and then a shared result for all of them. (I think this would be better for what I've been asked to do, and it saves the user some hassle.)

    If anyone can tell me how to code either of these in C++, I would be very grateful. Here is a sample of a file:

      0 -- 19 weight
      0 -- 20 weight

    I use this to determine density, and possibly ignore the weights, which are numbers.
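
    A minimal C++17 sketch of the second approach, assuming each line really is "node -- node weight" as in the sample (the density computation itself is left out, since the question doesn't define it):

      #include <filesystem>
      #include <fstream>
      #include <iostream>
      #include <string>

      int main(int argc, char* argv[]) {
          namespace fs = std::filesystem;
          if (argc < 2) { std::cerr << "usage: reader <folder>\n"; return 1; }
          long total = 0;
          for (const auto& entry : fs::directory_iterator(argv[1])) {
              if (!entry.is_regular_file()) continue;
              std::ifstream in(entry.path());
              long edges = 0;
              int from, to;
              std::string sep;   // swallows the "--"
              double weight;
              while (in >> from >> sep >> to >> weight) ++edges;   // one edge per line
              std::cout << entry.path().string() << ": " << edges << " edges\n";
              total += edges;    // the shared, all-files result
          }
          std::cout << "all files: " << total << " edges\n";
      }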

    Read the article

  • What Scripting Program would you choose to recover deleted and missing files?

    - by Steven Graf
    For a private project I'm looking for a command-line tool to scan for and recover files. I'm working on GNOME 3 (but I could also change my OS if it helps me reach my goal), and it must be able to find and recover files on attached devices with formats such as NTFS, FAT32, Mac OS Extended and ext3. Is there a single command-line tool to cover all of them, or do I need to use different programs to reach my goal? Can you recommend command-line tools for these kinds of tasks? Is one of you willing and able to show me some examples and teach me further?
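
    Two commonly paired command-line tools cover that filesystem list (a sketch; replace the device names with your own):

      sudo apt-get install testdisk   # ships both tools
      sudo testdisk /dev/sdb          # partition and file recovery (NTFS, FAT32, ext3, HFS+)
      sudo photorec /dev/sdb1         # signature-based carving, works regardless of filesystem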

    Read the article

  • Can I make Windows open Excel XML files with Excel without opening Internet Explorer?

    - by Sorin Sbarnea
    I want to be able to open Excel XML files in Excel, but without assigning .xml directly to Excel: there are lots of XML files that are not Excel files, and I don't want to open all of them in Excel. The file has the proper header for opening in Excel, but currently it opens Internet Explorer, which asks me whether I want to open the file with Excel, save it, or cancel. I just want to open it without those two extra annoying windows.
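
    One approach (a sketch: the extension name is arbitrary and the Excel path varies by Office version) is to give these files a dedicated extension and bind only that extension to Excel:

      assoc .xmlss=ExcelXMLFile
      ftype ExcelXMLFile="C:\Program Files\Microsoft Office\Office14\EXCEL.EXE" "%1"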

    Read the article

  • Is there any method of backing up Google Drive files in some sort of versioning system?

    - by VictorKilo
    Backstory

    My company is utilizing Google Drive for our shared files. Each user has their own Drive account. In addition, we have a corporate Drive account which holds documents that are shared with each user. Each folder is shared with different users depending on their permissions and positions in the company. Many users are able to add files and update folders within this shared Drive account. This is fine. What is not fine is when someone deletes something that they shouldn't. I have little to no way of knowing when a file is deleted wrongfully. Furthermore, anything that gets deleted goes into the trash bin of the file's creator, so I can't just restore it from the trash.

    Question

    Is there any method of backing up Google Drive files in some sort of versioning system that would allow me to revert files back to defined points in time?

    What I Have Tried

    I currently have this corporate Drive account synced to my personal computer through the Google Drive application. Each night, I run a backup of the files using Windows "Backup and Restore". This at least lets me get back files that are lost, but I'd like a cleaner method than this; it's very possible that I may not have the very latest version of a document on my computer when the utility runs.
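
    One lightweight way to get point-in-time versions out of the synced copy (a sketch; git is not part of the original setup) is to snapshot the sync folder into a local git repository on a schedule:

      cd /path/to/Google\ Drive
      git init                                         # once
      git add -A && git commit -m "nightly snapshot"   # run from a nightly scheduled task

    Reverting a wrongly deleted file is then git checkout <commit> -- path/to/file.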

    Read the article

  • DotNetNuke Gallery Module - 7 Videos

    In this tutorial we cover the installation of the DotNetNuke Core Gallery Module and show you how to configure it correctly. We walk you through how to use all of the features in the Gallery module, including creating albums, uploading files, bulk uploads, the slideshow, media files, watermarks, templates, and more. The videos contain:

      Video 1 - Introduction to the DNN Gallery Module, Installation and Basic Configuration
      Video 2 - How to Upload Images and Configure Their Settings
      Video 3 - Creating Gallery Albums and Bulk Uploading
      Video 4 - How to Add Files and Albums Using FTP, Adding Music and Changing Permissible Upload Types
      Video 5 - How to Add Video Files, How to Rate Files, Gallery Look and Feel
      Video 6 - Changing Feature Settings, Adding Watermarks, Gallery Security Roles
      Video 7 - Working with Private Galleries and Security Roles, Gallery Maintenance

    Total Time Length: 57 mins

    Read the article

  • Possible to recover older, previously deleted files with R-studio?

    - by SteveO
    The files and directories on one of my NTFS partitions were recently wiped out. I used R-Studio to scan the partition, and it did find many files, actually more than the capacity of the partition; this is because R-Studio also found files that were deleted even earlier. So I wonder: is it possible to single out the files and directories deleted most recently, rather than those deleted earlier, for recovery? R-Studio has a free demo version, for which scanning is free but recovery isn't. It is downloadable from http://www.data-recovery-software.net/Data_Recovery_Download.shtml and its manual is here: http://www.r-tt.com/downloads/Recovery_Manual.pdf. I have tried my best to search for answers in the manual, but failed to find one. Their technical support is not as good as their software, and is usually unhelpful, in my opinion. Thanks!

    Read the article

  • Team Foundation Server: Debug symbols (pdb files) generated in Release build? Fix it.

    - by Gopinath
    Yesterday I set up TFS for my .NET playground website to implement continuous integration and deployments. After a successful build, I noticed that debug symbols (pdb files) were generated even though TFS is configured to build in Release mode. After a bit of analysis, it turned out that TFS generates debug symbols (pdb files) unless we pass the attribute DebugType=none. Here are the steps to pass the DebugType parameter to TFS's MSBuild:

      1. Go to Team Explorer.
      2. Select Build Definition >> Edit Build Definition.
      3. Switch to the Process tab.
      4. Navigate to the Advanced section and locate MSBuild Arguments.
      5. Add the following: /p:Configuration=Release /p:DebugType=none
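
    The same effect can be baked into the project file instead of the build definition, so every Release build behaves this way regardless of who runs it (a standard MSBuild property group, shown as a sketch):

      <PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
        <DebugType>none</DebugType>
        <DebugSymbols>false</DebugSymbols>
      </PropertyGroup>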

    Read the article

  • Find which files an Apache process is writing to?

    - by Haluk
    We have this Apache process which becomes I/O-bound from time to time. Using atop, we can see it is a write operation. Using lsof -p <PID> we can see a list of files open by the httpd process. First we thought the log files must be the problem, so we turned them off just to test; however, the write operations still continue. We will continue testing a few other things. For instance, we use PHP session variables a lot; maybe the PHP session files are getting all the writing. But is there a way to quickly identify the files which get written to by the httpd process? That way we can focus our efforts on those files.

    UPDATE: We used the strace command as suggested. Here are two lines from the output:

      write(23, "\27\0\0\0\3SET CHARACTER SET utf8", 27) = 27
      write(23, "\17\0\0\0\3SET NAMES utf8", 19) = 19

    We do not have a mysql process on this server. So is strace also showing what is being written to a network socket?

    UPDATE 2: During high I/O load, the process which consumes most of the write resources gives the following output to strace -e trace=write -p <PID>:

      --- SIGCHLD (Child exited) @ 0 (0) ---
      write(9, "!", 1) = 1
      write(19, "OPTIONS * HTTP/1.0\r\nUser-Agent: Apache (internal dummy connection)\r\n\r\n", 70) = 70

    However, I cannot figure out where these are being written to.
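
    Two ways to resolve those bare descriptor numbers (a sketch; the -y flag needs a reasonably recent strace):

      lsof -a -p <PID> -d 9,19,23           # show exactly what fds 9, 19 and 23 point at
      strace -e trace=write -y -p <PID>     # -y prints the path or socket behind each fd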

    Read the article
