Search Results

Search found 40229 results on 1610 pages for 'deleted files'.


  • 12.04 run from cd, cannot copy files from mounted disk

    - by user75122
    I am running 12.04 live from CD and trying to copy some important files from a mounted hard disk to an external disk. (My previous installation of 10.04 crashed; I want to install 12.04 and back up some data before that.) When I try to copy files from the mounted disk to the external one, I get the following error: "There was an error copying the file into /media/New Volume/L300_Bkp_2012_06_04/pics. Error opening file: Permission denied." Is this related to the source (the mounted hard disk) or the target (the external disk)? How do I get around this?
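
    A common workaround (a sketch, not from the original question; both source paths below are placeholders to be adjusted) is to run the copy as root, since the live-session user typically has no read permission on another installation's home directory:

        # Run the copy as root; replace the source path with the real mount point.
        sudo cp -rv /media/disk/home/olduser/important-files \
            "/media/New Volume/L300_Bkp_2012_06_04/pics"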

    Read the article

  • Handling deleted users - separate or same table?

    - by Alan Beats
    The scenario is that I've got an expanding set of users, and as time goes by, users will cancel their accounts, which we currently mark as 'deleted' (with a flag) in the same table. If users with the same email address (that's how users log in) wish to create a new account, they can sign up again, but a NEW account is created. (We have unique IDs for every account, so email addresses can be duplicated between live and deleted accounts.) What I've noticed is that all across our system, in the normal course of things, we constantly query the users table checking that the user is not deleted, whereas what I'm thinking is that we don't need to do that at all! [Clarification1: by 'constantly querying', I meant that we have queries like '... FROM users WHERE isdeleted="0" AND ...'. For example, we may need to fetch all users registered for all meetings on a particular date, so in THAT query we also have FROM users WHERE isdeleted="0". Does this make my point clearer?] The options are:
    (1) continue keeping deleted users in the 'main' users table, or
    (2) keep deleted users in a separate table (mostly required for historical book-keeping).
    What are the pros and cons of either approach?
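
    Whichever layout is chosen, one way to keep the flag check out of everyday queries is a view over the live rows (a sketch; the table and column names are taken from the question, the view name is made up):

        -- Hypothetical view so application queries never repeat the flag check.
        CREATE VIEW active_users AS
            SELECT * FROM users WHERE isdeleted = 0;

        -- Queries then read from the view, e.g.:
        -- SELECT ... FROM active_users WHERE ...;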

    Read the article

  • What happened to my files?

    - by Ivan Broes
    After a successful upgrade from 11.04 to 11.10, a gradual deterioration occurred: my laptop became unstable with updates. I corrected the Aspell file, only to have another problem appear, and when I resolved one, yet another appeared. Things went from bad to worse, so I re-installed Windows Vista and Ubuntu 11.10 in the original partitions. Windows kept the old installation as Windows.old and I had no problem recovering my files there, but Ubuntu decided to make a new home directory. The question is: where did those files go after the re-installation? Are they deleted? If so, that's fine; duplicates are in Ubuntu One via synchronization, but I can only download one file at a time!

    Read the article

  • Why did Files (Nautilus) stop updating my partition's bookmarks?

    - by YuriC
    I've upgraded from 13.04 to 13.10 and noticed that Files (Nautilus) stopped updating my bookmarks that are located on another partition (an ext4 one). It used to work before. Testing, I've found out that if I add any new bookmark (using Ctrl+D, for example), Files then adds the new one and updates all bookmarks, showing the ones that point to my partition. I conclude that the feature (updating bookmarks) works, but it is not triggered when I mount my partition by clicking on it. Any hints on how to solve this? Bookmarks really speed up everyday activities.
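
    One thing worth checking (an assumption, not confirmed by the question: that this Nautilus version keeps its bookmarks in the GTK 3 bookmarks file) is whether the stored bookmark URIs still match the path the partition actually gets mounted at:

        # Compare the stored bookmark URIs...
        cat ~/.config/gtk-3.0/bookmarks
        # ...with the partition's actual mount point:
        mount | grep ext4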

    Read the article

  • Deleted files still accessible without www in url

    - by phlegma
    I have deleted all files and all hidden files off my server; there is nothing left but log files, which cannot be deleted. Ironically, the files are still accessible when nothing is there. I have cleared the cache and checked multiple browsers and computers/devices. The files show when I exclude "www" from the URL:
        http://sarastringfellow.com/assets/photo/c.jpg
        http://www.sarastringfellow.com/assets/photo/c.jpg
    What does this mean?
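
    A likely explanation (a guess; nothing in the question confirms it) is that the bare domain and the www name resolve to different servers, or an upstream cache still holds the old content. Comparing the DNS answers is a quick first check:

        dig +short sarastringfellow.com
        dig +short www.sarastringfellow.com
        # Different addresses would mean the two names hit different machines.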

    Read the article

  • Why I don't use SSIS checkpoint files

    - by jamiet
    In a recent discussion in regard to general ETL best practices, the subject of checkpoint files as a means for package restartability came up, and I stated that I was dead against using them. For anyone that may care, here is why:
    - Configuring them is distinctly unintuitive (that's a matter of opinion, but if you follow the link I'll wager that you will agree).
    - They don't make any allowance for loop iterations.
    - They cannot store variables of type Object.
    - They are limited in ability. There are many scenarios where you may want to execute certain containers regardless of whether the package is started from a checkpoint file, but the current usage model does not allow for this.
    - They are ignored by event handlers, which wouldn't be so bad if there were a way to toggle this behaviour in certain scenarios.
    - They don't work properly.
    I'll expand on the last bullet point. I have encountered situations where the behaviour for tasks executing concurrently is unpredictable. That is, sometimes the completion of a task that executes concurrently with a failed/failing task will make it into the checkpoint file, and sometimes it won't. This is near-impossible to reproduce, but it does happen, as my good friend John Welch will hopefully concur (if he is reading). Is anyone out there making successful use of checkpoint files within SSIS? I would be interested in knowing about that if so. @Jamiet

    Read the article

  • How to serve static files for multiple Django projects via nginx to same domain

    - by thanley
    I am trying to set up my nginx conf so that I can serve the relevant files for my multiple Django projects. Ultimately I want each app to be available at www.example.com/app1, www.example.com/app2, etc. They all serve static files from a 'static-files' directory located in their respective project roots. The project structure:

        home/
          ubuntu/
            web/
              www.example.com/
                ref/
                logs/
                app/
                  app1/
                    app1/
                    static/
                    bower_components/
                    templatetags/
                    app1_project/
                    templates/
                    static-files/
                  app2/
                    app2/
                    static/
                    templates/
                    templatetags/
                    app2_project/
                    static-files/
                  app3/
                    tests/
                    templates/
                    static-files/
                    static/
                    app3_project/
                    app3/
                venv/

    When I use the conf below, there are no problems serving the static-files for the app that I designate in the /static/ location. I can also access the different apps at their locations. However, I cannot figure out how to serve all of the static files for all the apps at the same time. I have looked into using the 'try_files' directive for the static location, but cannot figure out how to tell whether it is working or not.

    Nginx conf (only serving static files for one app):

        server {
            listen 80;
            server_name example.com;
            server_name www.example.com;

            access_log /home/ubuntu/web/www.example.com/logs/access.log;
            error_log /home/ubuntu/web/www.example.com/logs/error.log;
            root /home/ubuntu/web/www.example.com/;

            location /static/ {
                alias /home/ubuntu/web/www.example.com/app/app1/static-files/;
            }

            location /media/ {
                alias /home/ubuntu/web/www.example.com/media/;
            }

            location /app1/ {
                include uwsgi_params;
                uwsgi_param SCRIPT_NAME /app1;
                uwsgi_modifier1 30;
                uwsgi_pass unix:///home/ubuntu/web/www.example.com/app1.sock;
            }

            location /app2/ {
                include uwsgi_params;
                uwsgi_param SCRIPT_NAME /app2;
                uwsgi_modifier1 30;
                uwsgi_pass unix:///home/ubuntu/web/www.example.com/app2.sock;
            }

            location /app3/ {
                include uwsgi_params;
                uwsgi_param SCRIPT_NAME /app3;
                uwsgi_modifier1 30;
                uwsgi_pass unix:///home/ubuntu/web/www.example.com/app3.sock;
            }

            # What to serve if the upstream is not available or crashes
            error_page 400 /static/400.html;
            error_page 403 /static/403.html;
            error_page 404 /static/404.html;
            error_page 500 502 503 504 /static/500.html;

            # Compression
            gzip on;
            gzip_http_version 1.0;
            gzip_comp_level 5;
            gzip_proxied any;
            gzip_min_length 1100;
            gzip_buffers 16 8k;
            gzip_types text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript;

            # Some versions of IE 6 don't handle compression well on some
            # mime-types, so just disable it for them
            gzip_disable "MSIE [1-6].(?!.*SV1)";

            # Set a Vary header so downstream proxies don't send cached
            # gzipped content to IE6
            gzip_vary on;
        }

    Essentially I want to have something like this (I know this won't work):

        location /static/ {
            alias /home/ubuntu/web/www.example.com/app/app1/static-files/;
            alias /home/ubuntu/web/www.example.com/app/app2/static-files/;
            alias /home/ubuntu/web/www.example.com/app/app3/static-files/;
        }

    or (where it can serve the static files based on the URI):

        location /static/ {
            try_files $uri $uri/ =404;
        }

    So basically, if I use try_files like above, is the problem in my project directory structure? Or am I totally off base on this, and do I need to put each app in a subdomain instead of going this route? Thanks for any suggestions.

    TL;DR: I want to go to www.example.com/APP_NAME_HERE and have nginx serve the static location /home/ubuntu/web/www.example.com/app/APP_NAME_HERE/static-files/.
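
    One approach that matches the stated goal (a sketch, untested against this layout; it assumes an nginx built with PCRE named-capture support, and that each app's STATIC_URL is '/appN/static/') is a single regex location that maps the first path segment onto the matching static-files directory:

        location ~ ^/(?<app>app1|app2|app3)/static/(?<path>.*)$ {
            alias /home/ubuntu/web/www.example.com/app/$app/static-files/$path;
        }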

    Read the article

  • Deleted one membership table. Possible to import without breaking relationship?

    - by superexsl
    Hey, I hope this isn't going to be tricky or time-consuming, so fingers crossed. I'm working with the ASP.NET membership tables. However, I've got quite a few other tables that I've built, and most of them have a relationship with the dbo.aspnet_Membership table. I've accidentally deleted the dbo.aspnet_Membership table and can't get it back. There was no major data in it (it's on my local machine), but I would really like to copy that one table from another database I have, mainly for the sake of not breaking the schema. Is this possible? I'm worried that if I run the Aspnet_regsql.exe tool, it's going to break the schema and remove all data from the tables, as well as the relationships (which would take a while to re-establish). Is there any way I can import just the dbo.aspnet_Membership table into my current database? Thanks for any advice!
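
    One possible route (a sketch, untested; it assumes both databases live on the same SQL Server instance, and the database names are placeholders) is to script the table from the intact database instead of re-running Aspnet_regsql.exe:

        -- In SSMS, on the intact database: right-click dbo.aspnet_Membership,
        -- Script Table As > CREATE To, and run that script in the damaged database.
        -- Any rows worth keeping can then be copied across:
        INSERT INTO DamagedDb.dbo.aspnet_Membership
        SELECT * FROM IntactDb.dbo.aspnet_Membership;

    The foreign keys from the custom tables would still need to be re-created, but the rest of the schema and data stay untouched.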

    Read the article

  • Where are bluetooth received files saved

    - by Luis Alvarado
    I have used Blueman and the default Bluetooth manager, and every time I accept the file transfer from the phone to the PC, it shows that it is transferring and even notifies me that the transfer was successful, but I cannot find the image anywhere. Where are files stored after sending them via Bluetooth to the PC? I already checked my home folder, then the Pictures folder, which I thought would be the one for images, then the Documents folder, and afterwards I did a huge check on all folders under /home. No luck. It is nowhere. I even checked this answer, with no luck.
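
    A quick way to locate a file that has just been transferred, wherever the Bluetooth manager put it (a sketch; run it shortly after a transfer):

        # List files under the home directory modified in the last 10 minutes.
        find ~ -type f -mmin -10 -ls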

    Read the article

  • Which hidden files and directories do I need?

    - by Sammy Black
    In a previous question, I explained my situation/plan: backing up home directory on external drive, reformatting laptop drive, installing 14.04, putting home directory back. (It hasn't happened yet because I can't seem to find the down time, in case things aren't working right away.) It occurred to me that maybe I don't want all of those hidden files and directories (e.g. .local/share/ubuntuone/syncdaemon/, .cache/google-chrome/, etc.) Just judging by the amount of time in copying, I can tell that some of these hidden directories are large. Question: Are there any hidden directories that I obviously don't need/want when I have the laptop running an updated distribution? Will they cause conflicts? (I plan on copying the backed-up directory tree back onto the laptop with the --no-clobber option.)
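
    For deciding what to carry over, a quick size survey of the hidden entries helps; as general guidance (not specific to this system), caches such as .cache are safe to leave behind, while .config and .local mostly hold settings worth keeping:

        # Sizes of all hidden files/directories in $HOME, largest last.
        du -sh ~/.[!.]* | sort -h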

    Read the article

  • Hiding images in folder without renaming or moving the files

    - by Marcus
    I'm dual-booting Ubuntu with Windows and I have all my music on a separate hard drive. In Windows the album art is hidden by default, but when I access the folder in Ubuntu there are two album artwork files for every MP3, one small and one large. I would like to hide those images without renaming them with a leading dot or moving them to some other folder, because then the album artwork would disappear in Windows. Is there a way to make a .hidden file which hides all images, or any other way to hide all images in Nautilus?
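
    Nautilus does honour a plain-text file named .hidden listing one filename per line, so the covers can be hidden without renaming or moving anything. A sketch, assuming an artist/album folder layout and run from the music drive's root:

        # Write each album folder's image names into a .hidden file.
        for d in */*/; do
            (cd "$d" && ls *.jpg *.png 2>/dev/null > .hidden)
        done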

    Read the article

  • Eclipse won't include/identify user .h and .o files

    - by bks
    I'm new to Eclipse CDT, and am trying to use an existing .o file that was supplied with the proper .h file. Eclipse just won't include it in the compilation process. I tried drag-and-drop into the Eclipse project browser, and the files did show there, but to no avail: still "No such file: No such file or directory". I tried defining the .o file under Project Properties > Tool Settings > MinGW C Linker > Miscellaneous > Other objects. That didn't work either. As for the header file, I did try a workaround: I created a new file in the project, named it after the file I want to include as a header, then copied in the content. And yet the compiler/linker didn't recognize the object file. Perhaps you can help? Thank you.
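
    For reference, what those IDE settings have to reproduce is the plain command line below (a sketch; the file names are placeholders). If this builds in a terminal but the Eclipse build still fails, the project configuration is the culprit:

        # -I adds the directory holding the supplied header; the .o file is
        # simply passed to the linker alongside your own sources.
        gcc -I/path/to/headers main.c supplied.o -o myprog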

    Read the article

  • Lost Windows 7 files

    - by Pader
    My intention was to have a dual-boot system with Ubuntu and Windows 7. Obviously I did something wrong, because although I had a boot menu (is it normal for it to look DOS-like?) which gave me an option of booting into Windows 7, I was unable to do so. Also, when I booted into Ubuntu, my Windows 7 drive was not available. The Windows 7 drive was an internal 1 TB drive partitioned into a 200 GB (OS) partition and a second partition making up the remainder. I was still unable to access this Windows 7 drive even after deleting Ubuntu, as I kept getting a 'requires an NTFS drive' error, or something similar. I could not even re-install Windows 7, as the disk was not recognised. I did eventually get the drive back, but I cannot for the life of me remember how. I did try to recover my lost Windows 7 data using Ontrack Easy Recovery (which has always been successful in the past for post-format recovery), but it would not recognise the 1 TB drive, although it was now formatted as NTFS. From other posts on this site, I gather that this is considered a 'Windows 7 site' problem by Linux users. However, I would dearly love to recover some of my lost Windows 7 files. I had resigned myself to a lot of lost personal data, but I happened to notice that a 2 TB drive I had connected through a USB docking station had been repartitioned. It must have happened when I installed Ubuntu, as I can think of no other explanation. I certainly do not remember consciously requiring Ubuntu to do this. The additional two partitions on the 2TB drive, the original Windows
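
    Since this is ultimately about getting partitions and their files back, one commonly suggested tool is TestDisk (a sketch; it scans drives and proposes recoverable partition tables, and nothing should be written to the affected drives in the meantime):

        sudo apt-get install testdisk
        # Interactive: select the affected drive, then Analyse > Quick Search.
        sudo testdisk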

    Read the article

  • How to Recover that Photo, Picture or File You Deleted Accidentally

    - by The Geek
    Have you ever accidentally deleted a photo on your camera, computer, USB drive, or anywhere else? What you might not know is that you can usually restore those pictures—even from your camera’s memory stick. Windows tries to prevent you from making a big mistake by providing the Recycle Bin, where deleted files hang around for a while—but unfortunately it doesn’t work for external USB drives, USB flash drives, memory sticks, or mapped drives. Luckily there’s another way to recover deleted files. Note: we originally wrote this article a year ago, but we’ve received this question so many times from readers, friends, and families that we’ve polished it up and are republishing it for everybody. So far, everybody has reported success!

    Read the article

  • SQL SERVER – Finding Out What Changed in a Deleted Database – Notes from the Field #041

    - by Pinal Dave
    [Note from Pinal]: This is the 41st episode of the Notes from the Field series. The real world is full of challenges. When we are reading theory or a book, we sometimes do not realize how the real world works, and that is why we have this series, which is extremely popular with developers and DBAs. Let us talk about the interesting problem of how to figure out what has changed in a DELETED database. Well, you may think I am just throwing words around, but in reality this kind of problem makes a DBA's life interesting, and in this blog post we have an amazing story from Brian Kelley on the subject. In this episode of the Notes from the Field series, database expert Brian Kelley explains how to find out what has changed in a deleted database. Read the experience of Brian in his own words.

    Sometimes, one of the hardest questions to answer is, "What changed?" A similar question is, "Did anything change other than what we expected to change?"

    The First Place to Check – Schema Changes History Report: Pinal has recently written on the Schema Changes History report and its requirement for the Default Trace to be enabled. This is always the first place I look when I am trying to answer these questions. There are a couple of obvious limitations with the Schema Changes History report. First, while it reports what changed, when it changed, and who changed it, other than the base DDL operation (CREATE, ALTER, DELETE), it does not present what the changes actually were. This is not something covered by the default trace. Second, the default trace has a fixed size. When it hits that size, the changes begin to overwrite. As a result, if you wait too long, especially on a busy database server, you may find your changes rolled off.

    But the Database Has Been Deleted! Pinal cited another issue, and that's the inability to run the Schema Changes History report if the database has been dropped. Thankfully, all is not lost. One thing to remember is that the Schema Changes History report is ultimately driven by the Default Trace. As you may have guessed, it's a trace like any other database trace, and the Default Trace does write to disk. The trace files are written to the defined LOG directory for that SQL Server instance and have a prefix of log_. Therefore, you can read the trace files like any other. Tip: copy the files to a working directory; otherwise, you may occasionally receive a file-in-use error. With the Default Trace files, if you ask the question early enough, you can see the information for a deleted database just the same as for any other database.

    Testing with a Deleted Database: Here's a short script that will create a database, create a schema, create an object, and then drop the database. Without the database, you can't do a standard Schema Changes History report.

        CREATE DATABASE DeleteMe;
        GO
        USE DeleteMe;
        GO
        CREATE SCHEMA Test AUTHORIZATION dbo;
        GO
        CREATE TABLE Test.Foo (FooID INT);
        GO
        USE MASTER;
        GO
        DROP DATABASE DeleteMe;
        GO

    This sets up the perfect situation: we can't retrieve the information using the Schema Changes History report, but it's still available.

    Finding the Information: I've sorted the columns so I can see the Event Subclass, the Start Time, the Database Name, the Object Name, and the Object Type at the front, but otherwise I'm just looking at the trace files using SQL Profiler. The information is definitely there. Therefore, even in the case of a dropped/deleted database, you can still determine who did what and when. You can even determine who dropped the database (loginame is captured). The key is to get the default trace files in a timely manner in order to extract the information.

    If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL
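
    The article reads the copied trace files in SQL Profiler; the same files can also be queried straight from T-SQL (a sketch; the path is a placeholder for one of the copied log_*.trc files):

        -- Read a copied Default Trace file as a table.
        SELECT StartTime, DatabaseName, ObjectName, ObjectType, LoginName
        FROM sys.fn_trace_gettable(N'C:\Working\log_123.trc', DEFAULT)
        ORDER BY StartTime;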

    Read the article

  • Hiding Files in Windows

    - by frbry
    Hello. Currently, I'm developing a system which will extract some files from an SFX archive (files that will be used by another app). I want to make the extracted files hidden, so that a person who finds the location of the exe can't get at the files in the same directory as the exe. I know I can apply attrib +h to the files, but if the user turns on the "show hidden and system files" option in Windows, the files will be visible. Isn't there any method to overcome this? Any suggestion is welcome. Thanks.
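
    One commonly suggested half-measure (a sketch; it only raises the bar and is not real protection, since Explorer shows such files once 'Hide protected operating system files' is unchecked) is to mark the files system as well as hidden:

        rem Hidden + system files stay invisible under Explorer's default settings.
        attrib +s +h "C:\path\to\extracted\file.dat"

    Anything stronger means not relying on file attributes at all, e.g. keeping the data inside the exe or encrypting it.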

    Read the article

  • Storing image files, PSD files, AI files, and Flash in Subversion

    - by nishantcm
    Hi, Can I store large amounts of image files in Subversion? My designers usually create these designs and store them anywhere on their PCs, and there's no system. Can I store the files in an SVN repository? That way I can also protect my data against unauthorized access, and it's also easier to archive. What are your comments, and is there any better way to do this? Thanks!

    Read the article

  • Using PHP to list some files in folders

    - by Terix
    I have collected many free themes from around the internet. Each of them has a screenshot.jpg or screenshot.png file in its folder. I want to scan all the folders for that file and return the full file path, to be used with an img HTML tag. I am not interested in partial paths or in folders where there is no screenshot. For example, if my folder structure is:

        ./a/b/
        ./c/d/e/screenshot.jpg
        ./f/
        ./g/screenshot.jpg
        ./h/i/j/k/
        ./l/m/screenshot.png
        ./n/o/
        ./p/screenshot.jpg

    I want to get:

        ./c/d/e/screenshot.jpg
        ./g/screenshot.jpg
        ./l/m/screenshot.png
        ./p/screenshot.jpg

    I managed somehow to write a recursive function, but I only figured out how to return a nested array, so I can't get rid of what I don't need, and I miss the .png files. Can anyone help me with that? The code I managed to put together is this:

        function getDirectoryTree($outerDir, $x) {
            $dirs = array_diff(scandir($outerDir), array(".", ".."));
            $dir_array = array();
            foreach ($dirs as $d) {
                if (is_dir($outerDir . "/" . $d)) {
                    $dir_array[$d] = getDirectoryTree($outerDir . "/" . $d, $x);
                } else {
                    if ($d == $x) $dir_array[$d] = $d;
                }
            }
            return $dir_array;
        }
        $dirlist = getDirectoryTree('.', 'screenshot.jpg');
        print_r($dirlist);
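
    A flat list of full paths, matching both extensions, can be produced with the SPL iterators instead of a nested array (a sketch written against the folder layout shown above):

        $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));
        foreach ($it as $file) {
            // Keep only files named screenshot.jpg or screenshot.png.
            if (preg_match('/^screenshot\.(jpg|png)$/', $file->getFilename())) {
                echo $file->getPathname() . "\n";
            }
        }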

    Read the article

  • Get access to files on old HD installation and remove system files

    - by Blake
    I have been fooling with Ubuntu for only a year or so. I added an SSD, installed Ubuntu (12.04, 64-bit), and connected the old drive, which has Ubuntu on it. Everything seems to work well, except for access to some files. I would like to do two things: 1) move the swap file from the SSD to the partition on the HD, and 2) remove the Ubuntu system files from the old drive and gain full access to my other files there. I can still remove the SSD and run Ubuntu from the HD, so if I am approaching this incorrectly, please advise. Thanks in advance, Blake
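
    For the second item, blocked access is usually just ownership: the files on the old drive belong to the old installation's user ID. Taking ownership from the new system is often enough (a sketch; the mount point and user names are placeholders):

        # Give the current user ownership of the old home directory.
        sudo chown -R "$USER":"$USER" /media/old-disk/home/old-user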

    Read the article

  • Cannot copy files from external hard drive to desktop hard drive in Windows 7

    - by Mohammad Reza Selim
    I'm trying to copy some old files from one of my external hard drives to the hard drive of my desktop PC. Some files cannot be copied; I get an error like "Cannot read from source file or disk". The files are video files (.DAT, .VOB, .MPG), and I watched them all the way through with no issues, so they aren't corrupted. I'm running Windows 7, with admin permissions. Could anyone tell me the likely reason and a solution?
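
    "Cannot read from source file or disk" usually points at read errors on the external drive rather than at permissions. Two standard first steps on Windows 7 (a sketch; E: and the folder paths are placeholders):

        rem Check the external drive's file system for errors.
        chkdsk E: /f
        rem Retry the copy with limited retries so unreadable files are skipped.
        robocopy E:\videos C:\Users\You\Videos /E /R:1 /W:1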

    Read the article

  • INI files or Registry or personal files

    - by Shirish11
    I want to save the configuration of my project. This includes:
    - Screen size
    - Screen position
    - Folder paths
    - User settings
    and so on. The standard places where you can save these configuration values are:
    - Registry
    - INI files
    - Personal files (like *.cfg)
    I would like to know how you choose between these places. Also, are there any pros and cons of using any of them?

    Read the article

  • Every file on cPanel got deleted (then restored hours later), and I have no idea why

    - by mcranston18
    I apologize in advance if I don't provide proper detail; I am new to server stuff and am looking for general advice about this issue: I was helping out a web-design client last month. They have about a dozen static sites on one server. The sites are all built on Joomla, except one, which I built on WordPress. Everything was working fine last month when we did the redesign, but all of a sudden this morning every single file on their server got deleted: every web page, every file, and all e-mail addresses. I phoned the hosting company (alliancewww.com) to ask why every single file suddenly got deleted from the server. They said, "because someone must have deleted it." I said, "well, no one did." (Which I'm pretty damn sure no one did.) They said, "you can pay us to look into the problem." I authorized $150 for them to look into it. About an hour later, everything was magically reinstated. The host said they had a back-up of everything and just restored it. What I'm wondering: does anyone have recommendations of logs I can go through to investigate how the files got deleted in the first place? I've checked the cPanel logs but found nothing. Is it likely that this is a mess-up on the host's part?
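
    A few places worth checking on a cPanel box (a sketch; these are typical cPanel log locations that may vary by version, and root/SSH access is assumed):

        # cPanel/WHM and API activity, including file-manager operations:
        less /usr/local/cpanel/logs/access_log
        # FTP activity, another common route for mass deletions:
        less /var/log/messages
        # Recent interactive logins:
        last | head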

    Read the article

  • Rkhunter 122 suspect files; do I have a problem?

    - by user276166
    I am new to Ubuntu. I am using Xfce Ubuntu 14.04 LTS. I ran rkhunter a few weeks ago and only got a few warnings. The forum said that they were normal. But this time rkhunter reported 122 warnings. Please advise.

        casey@Shaman:~$ sudo rkhunter -c
        [ Rootkit Hunter version 1.4.0 ]

        Checking system commands...

        Performing 'strings' command checks
        Checking 'strings' command [ OK ]

        Performing 'shared libraries' checks
        Checking for preloading variables [ None found ]
        Checking for preloaded libraries [ None found ]
        Checking LD_LIBRARY_PATH variable [ Not found ]

        Performing file properties checks
        Checking for prerequisites [ Warning ]
        /usr/sbin/adduser [ Warning ]
        /usr/sbin/chroot [ Warning ]
        /usr/sbin/cron [ OK ]
        /usr/sbin/groupadd [ Warning ]
        /usr/sbin/groupdel [ Warning ]
        /usr/sbin/groupmod [ Warning ]
        /usr/sbin/grpck [ Warning ]
        /usr/sbin/nologin [ Warning ]
        /usr/sbin/pwck [ Warning ]
        /usr/sbin/rsyslogd [ Warning ]
        /usr/sbin/useradd [ Warning ]
        /usr/sbin/userdel [ Warning ]
        /usr/sbin/usermod [ Warning ]
        /usr/sbin/vipw [ Warning ]
        /usr/bin/awk [ Warning ]
        /usr/bin/basename [ Warning ]
        /usr/bin/chattr [ Warning ]
        /usr/bin/cut [ Warning ]
        /usr/bin/diff [ Warning ]
        /usr/bin/dirname [ Warning ]
        /usr/bin/dpkg [ Warning ]
        /usr/bin/dpkg-query [ Warning ]
        /usr/bin/du [ Warning ]
        /usr/bin/env [ Warning ]
        /usr/bin/file [ Warning ]
        /usr/bin/find [ Warning ]
        /usr/bin/GET [ Warning ]
        /usr/bin/groups [ Warning ]
        /usr/bin/head [ Warning ]
        /usr/bin/id [ Warning ]
        /usr/bin/killall [ OK ]
        /usr/bin/last [ Warning ]
        /usr/bin/lastlog [ Warning ]
        /usr/bin/ldd [ Warning ]
        /usr/bin/less [ OK ]
        /usr/bin/locate [ OK ]
        /usr/bin/logger [ Warning ]
        /usr/bin/lsattr [ Warning ]
        /usr/bin/lsof [ OK ]
        /usr/bin/mail [ OK ]
        /usr/bin/md5sum [ Warning ]
        /usr/bin/mlocate [ OK ]
        /usr/bin/newgrp [ Warning ]
        /usr/bin/passwd [ Warning ]
        /usr/bin/perl [ Warning ]
        /usr/bin/pgrep [ Warning ]
        /usr/bin/pkill [ Warning ]
        /usr/bin/pstree [ OK ]
        /usr/bin/rkhunter [ OK ]
        /usr/bin/rpm [ Warning ]
        /usr/bin/runcon [ Warning ]
        /usr/bin/sha1sum [ Warning ]
        /usr/bin/sha224sum [ Warning ]
        /usr/bin/sha256sum [ Warning ]
        /usr/bin/sha384sum [ Warning ]
        /usr/bin/sha512sum [ Warning ]
        /usr/bin/size [ Warning ]
        /usr/bin/sort [ Warning ]
        /usr/bin/stat [ Warning ]
        /usr/bin/strace [ Warning ]
        /usr/bin/strings [ Warning ]
        /usr/bin/sudo [ Warning ]
        /usr/bin/tail [ Warning ]
        /usr/bin/test [ Warning ]
        /usr/bin/top [ Warning ]
        /usr/bin/touch [ Warning ]
        /usr/bin/tr [ Warning ]
        /usr/bin/uniq [ Warning ]
        /usr/bin/users [ Warning ]
        /usr/bin/vmstat [ Warning ]
        /usr/bin/w [ Warning ]
        /usr/bin/watch [ Warning ]
        /usr/bin/wc [ Warning ]
        /usr/bin/wget [ Warning ]
        /usr/bin/whatis [ Warning ]
        /usr/bin/whereis [ Warning ]
        /usr/bin/which [ OK ]
        /usr/bin/who [ Warning ]
        /usr/bin/whoami [ Warning ]
        /usr/bin/unhide.rb [ Warning ]
        /usr/bin/mawk [ Warning ]
        /usr/bin/lwp-request [ Warning ]
        /usr/bin/heirloom-mailx [ OK ]
        /usr/bin/w.procps [ Warning ]
        /sbin/depmod [ Warning ]
        /sbin/fsck [ Warning ]
        /sbin/ifconfig [ Warning ]
        /sbin/ifdown [ Warning ]
        /sbin/ifup [ Warning ]
        /sbin/init [ Warning ]
        /sbin/insmod [ Warning ]
        /sbin/ip [ Warning ]
        /sbin/lsmod [ Warning ]
        /sbin/modinfo [ Warning ]
        /sbin/modprobe [ Warning ]
        /sbin/rmmod [ Warning ]
        /sbin/route [ Warning ]
        /sbin/runlevel [ Warning ]
        /sbin/sulogin [ Warning ]
        /sbin/sysctl [ Warning ]
        /bin/bash [ Warning ]
        /bin/cat [ Warning ]
        /bin/chmod [ Warning ]
        /bin/chown [ Warning ]
        /bin/cp [ Warning ]
        /bin/date [ Warning ]
        /bin/df [ Warning ]
        /bin/dmesg [ Warning ]
        /bin/echo [ Warning ]
        /bin/ed [ OK ]
        /bin/egrep [ Warning ]
        /bin/fgrep [ Warning ]
        /bin/fuser [ OK ]
        /bin/grep [ Warning ]
        /bin/ip [ Warning ]
        /bin/kill [ Warning ]
        /bin/less [ OK ]
        /bin/login [ Warning ]
        /bin/ls [ Warning ]
        /bin/lsmod [ Warning ]
        /bin/mktemp [ Warning ]
        /bin/more [ Warning ]
        /bin/mount [ Warning ]
        /bin/mv [ Warning ]
        /bin/netstat [ Warning ]
        /bin/ping [ Warning ]
        /bin/ps [ Warning ]
        /bin/pwd [ Warning ]
        /bin/readlink [ Warning ]
        /bin/sed [ Warning ]
        /bin/sh [ Warning ]
        /bin/su [ Warning ]
        /bin/touch [ Warning ]
        /bin/uname [ Warning ]
        /bin/which [ OK ]
        /bin/kmod [ Warning ]
        /bin/dash [ Warning ]
        [Press <ENTER> to continue]

        Checking for rootkits...

        Performing check of known rootkit files and directories
        55808 Trojan - Variant A [ Not found ]
        ADM Worm [ Not found ]
        AjaKit Rootkit [ Not found ]
        Adore Rootkit [ Not found ]
        aPa Kit [ Not found ]
        Apache Worm [ Not found ]
        Ambient (ark) Rootkit [ Not found ]
        Balaur Rootkit [ Not found ]
        BeastKit Rootkit [ Not found ]
        beX2 Rootkit [ Not found ]
        BOBKit Rootkit [ Not found ]
        cb Rootkit [ Not found ]
        CiNIK Worm (Slapper.B variant) [ Not found ]
        Danny-Boy's Abuse Kit [ Not found ]
        Devil RootKit [ Not found ]
        Dica-Kit Rootkit [ Not found ]
        Dreams Rootkit [ Not found ]
        Duarawkz Rootkit [ Not found ]
        Enye LKM [ Not found ]
        Flea Linux Rootkit [ Not found ]
        Fu Rootkit [ Not found ]
        Fuck`it Rootkit [ Not found ]
        GasKit Rootkit [ Not found ]
        Heroin LKM [ Not found ]
        HjC Kit [ Not found ]
        ignoKit Rootkit [ Not found ]
        IntoXonia-NG Rootkit [ Not found ]
        Irix Rootkit [ Not found ]
        Jynx Rootkit [ Not found ]
        KBeast Rootkit [ Not found ]
        Kitko Rootkit [ Not found ]
        Knark Rootkit [ Not found ]
        ld-linuxv.so Rootkit [ Not found ]
        Li0n Worm [ Not found ]
        Lockit / LJK2 Rootkit [ Not found ]
        Mood-NT Rootkit [ Not found ]
        MRK Rootkit [ Not found ]
        Ni0 Rootkit [ Not found ]
        Ohhara Rootkit [ Not found ]
        Optic Kit (Tux) Worm [ Not found ]
        Oz Rootkit [ Not found ]
        Phalanx Rootkit [ Not found ]
        Phalanx2 Rootkit [ Not found ]
        Phalanx2 Rootkit (extended tests) [ Not found ]
        Portacelo Rootkit [ Not found ]
        R3dstorm Toolkit [ Not found ]
        RH-Sharpe's Rootkit [ Not found ]
        RSHA's Rootkit [ Not found ]
        Scalper Worm [ Not found ]
        Sebek LKM [ Not found ]
        Shutdown Rootkit [ Not found ]
        SHV4 Rootkit [ Not found ]
        SHV5 Rootkit [ Not found ]
        Sin Rootkit [ Not found ]
        Slapper Worm [ Not found ]
        Sneakin Rootkit [ Not found ]
        'Spanish' Rootkit [ Not found ]
        Suckit Rootkit [ Not found ]
        Superkit Rootkit [ Not found ]
        TBD (Telnet BackDoor) [ Not found ]
        TeLeKiT Rootkit [ Not found ]
        T0rn Rootkit [ Not found ]
        trNkit Rootkit [ Not found ]
        Trojanit Kit [ Not found ]
        Tuxtendo Rootkit [ Not found ]
        URK Rootkit [ Not found ]
        Vampire Rootkit [ Not found ]
        VcKit Rootkit [ Not found ]
        Volc Rootkit [ Not found ]
        Xzibit Rootkit [ Not found ]
        zaRwT.KiT Rootkit [ Not found ]
        ZK Rootkit [ Not found ]
        [Press <ENTER> to continue]

        Performing additional rootkit checks
        Suckit Rookit additional checks [ OK ]
        Checking for possible rootkit files and directories [ None found ]
        Checking for possible rootkit strings [ None found ]

        Performing malware checks
        Checking running processes for suspicious files [ None found ]
        Checking for login backdoors [ None found ]
        Checking for suspicious directories [ None found ]
        Checking for sniffer log files [ None found ]

        Performing Linux specific checks
        Checking loaded kernel modules [ OK ]
        Checking kernel module names [ OK ]
        [Press <ENTER> to continue]

        Checking the network...

        Performing checks on the network ports
        Checking for backdoor ports [ None found ]
        Checking for hidden ports [ Skipped ]

        Performing checks on the network interfaces
        Checking for promiscuous interfaces [ None found ]

        Checking the local host...

        Performing system boot checks
        Checking for local host name [ Found ]
        Checking for system startup files [ Found ]
        Checking system startup files for malware [ None found ]

        Performing group and account checks
        Checking for passwd file [ Found ]
        Checking for root equivalent (UID 0) accounts [ None found ]
        Checking for passwordless accounts [ None found ]
        Checking for passwd file changes [ Warning ]
        Checking for group file changes [ Warning ]
        Checking root account shell history files [ None found ]

        Performing system configuration file checks
        Checking for SSH configuration file [ Not found ]
        Checking for running syslog daemon [ Found ]
        Checking for syslog configuration file [ Found ]
        Checking if syslog remote logging is allowed [ Not allowed ]

        Performing filesystem checks
        Checking /dev for suspicious file types [ Warning ]
        Checking for hidden files and directories [ Warning ]
        [Press <ENTER> to continue]

        System checks summary
        =====================

        File properties checks...
        Required commands check failed
        Files checked: 137
        Suspect files: 122

        Rootkit checks...
        Rootkits checked : 291
        Possible rootkits: 0

        Applications checks...
        All checks skipped

        The system checks took: 5 minutes and 11 seconds

        All results have been written to the log file (/var/log/rkhunter.log)
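
    For what it's worth, a wall of file-properties warnings like this is most often caused by rkhunter's baseline being stale after package updates rather than by a rootkit (an assumption; the flagged binaries should be verified before trusting anything). A typical sequence:

        # Verify installed files against the package manager's checksums.
        sudo dpkg --verify
        # If nothing looks wrong, refresh rkhunter's data and its baseline.
        sudo rkhunter --update
        sudo rkhunter --propupd
        sudo rkhunter -c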

    Read the article
