Search Results

  • Situations that require protecting files against tampering when stored on a user's computer

    - by Joel
    I'm making a 'Pokémon Storage System' with a client/server model, and as part of that I was thinking of storing an inventory file on the user's computer which I do not want edited except by my program. An alternative would be to store the inventory file on the server instead and control its editing by sending commands to the server. But I was wondering: are there situations that require files to be stored on a user's computer where editing would be undesirable, and if so, how do you protect those files? I was thinking AES with some sort of checksum?
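
    A keyed MAC is one way to make the "AES with some sort of checksum" idea concrete: a plain checksum can simply be recomputed by whoever edits the file, while an HMAC requires the key. Below is a minimal sketch in Python (the key and file layout are made up for illustration). Note that a key embedded in a client program can always be extracted, so this only deters casual editing, not determined tampering; the server-side alternative is the only robust option.

        import hmac, hashlib

        SECRET_KEY = b"embedded-app-key"  # hypothetical; ships with the client, so extractable

        def save_inventory(path, data):
            # Prepend an HMAC-SHA256 tag so later edits can be detected.
            tag = hmac.new(SECRET_KEY, data, hashlib.sha256).digest()
            with open(path, "wb") as f:
                f.write(tag + data)

        def load_inventory(path):
            with open(path, "rb") as f:
                blob = f.read()
            tag, data = blob[:32], blob[32:]   # SHA-256 digest is 32 bytes
            expected = hmac.new(SECRET_KEY, data, hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expected):
                raise ValueError("inventory file was modified outside the program")
            return data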

  • Limit the size of a directory by deleting old files

    - by Sulliwane
    I have an IP cam which saves its recordings to a directory named Camera1 on my Ubuntu Server 12.04. I would like to limit the size of this folder to 5 GB by deleting, say once a day, the oldest files. I first checked the quota program, but it doesn't seem to allow creating new files while deleting the old ones. So I think the best workaround would be to run a script, but I have no idea how to write it... Thank you guys!
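
    A minimal sketch of such a script in Python (the directory path is a placeholder); scheduled once a day from cron, it deletes the oldest recordings until the folder is back under the limit:

        import os

        CAMERA_DIR = "/path/to/Camera1"   # placeholder: the IP cam's recording directory
        LIMIT = 5 * 1024**3               # 5 GiB

        # Collect (mtime, size, path) for every regular file in the directory.
        entries = []
        for name in os.listdir(CAMERA_DIR):
            path = os.path.join(CAMERA_DIR, name)
            if os.path.isfile(path):
                st = os.stat(path)
                entries.append((st.st_mtime, st.st_size, path))

        total = sum(size for _, size, _ in entries)

        # Delete oldest files first until the total size is under the limit.
        for _, size, path in sorted(entries):
            if total <= LIMIT:
                break
            os.remove(path)
            total -= size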

  • Sending files via HTTP to web service

    - by Serguei Fedorov
    I am a bit frustrated at the lack of information about this online. Here is the issue: I am in charge of creating an iOS application which sends sound data back and forth between the server and the app. The audio is in small files, so it does not need to be streamed; it can simply be sent whole. Right now I am using a TCP server I wrote to handle applications like this. However, I want to keep the system as simple as possible, and writing your own server and client sockets can get a bit complex and leaves room for crashes. Overall it slows down development because I need to account for packet structure and other things. My question is: can I write an ASPX or PHP web service that lets me pass the files back and forth through GET or POST?
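
    For small files a plain HTTP POST is enough, and it removes the custom packet framing entirely. Here is a hedged sketch of the client side using Python's third-party requests library, purely to illustrate the HTTP mechanics (the URL and field name are hypothetical); the same multipart request can be issued from iOS, and the receiving end could be a PHP script reading $_FILES or an ASPX handler:

        import requests  # third-party: pip install requests

        def upload_clip(path):
            # Standard multipart/form-data upload; no custom packet structure needed.
            with open(path, "rb") as f:
                resp = requests.post("https://example.com/upload.php", files={"clip": f})
            resp.raise_for_status()

        def download_clip(clip_id, dest):
            resp = requests.get("https://example.com/clips/" + clip_id)
            resp.raise_for_status()
            with open(dest, "wb") as f:
                f.write(resp.content)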

  • Improve exception logging

    - by Jaider
    I am planning to use log4net in a new web project. In my experience, I have seen how big the log table can get, and I have noticed that errors and exceptions get repeated. For instance, I just queried a log table with more than 132,000 records; using DISTINCT, I found that only about 2,500 records (~2%) are unique, and the rest (~98%) are duplicates. So I came up with this idea to improve logging: add a couple of new columns, counter and updated_dt, that are updated every time an insert of the same record is attempted. (To track the users who caused the exception, a log_user table would be needed to map the N-to-N relationship.) Implemented naively, this model could make the system slow and inefficient, since it would be comparing all these long text fields. Here is the trick: also add a binary hash column of 16 or 32 bytes that hashes the message plus the exception, and configure an index on it; HASHBYTES can help here. I am not a DB expert, but I think that is the fastest way to locate a similar record. Because hashing doesn't guarantee uniqueness, the hash only narrows the search to a few candidate rows, which are then compared by message and exception directly to make sure they really match. This is a theoretical/practical solution, but will it work, or will it bring more complexity? What aspects am I leaving out, or what other considerations should I take into account? A trigger could do the insert-or-update work, but is a trigger the best way to do it?
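
    A minimal sketch of the hash-and-upsert idea in Python, with SQLite standing in for the real database (table and column names are made up; on SQL Server the hash would come from HASHBYTES and the upsert from a MERGE or trigger):

        import hashlib, sqlite3

        conn = sqlite3.connect("logs.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS error_log (
                hash       TEXT PRIMARY KEY,  -- SHA-256 of message + exception
                message    TEXT,
                exception  TEXT,
                counter    INTEGER NOT NULL DEFAULT 1,
                updated_dt TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
            )""")

        def log_error(message, exception):
            digest = hashlib.sha256((message + "\x00" + exception).encode()).hexdigest()
            # Insert a new row, or bump the counter if this exact error was seen
            # before. The ON CONFLICT upsert needs SQLite 3.24+.
            conn.execute("""
                INSERT INTO error_log (hash, message, exception)
                VALUES (?, ?, ?)
                ON CONFLICT(hash) DO UPDATE SET
                    counter = counter + 1,
                    updated_dt = CURRENT_TIMESTAMP""",
                (digest, message, exception))
            conn.commit()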

  • Intermittent FTP login issues (Microsoft IIS FTP Service)

    - by JaggenSWE
    I've got a somewhat weird problem which I'm not sure how to troubleshoot. We have an FTP server running on a Windows Server 2003 machine using the IIS FTP Service; it is for our clients and is configured with IP restrictions. However, now ONE of the clients complains that they can't log in to the server from time to time. This is just one of 10+ clients that has this issue, which makes me think it's a problem on their side. Just to be on the safe side I had a peek into the FTP logs and found something strange. Whenever they succeed in logging in, this is what I find in the logs:

        nnn.nnn.nnn.70, userxxx, 2012-06-11, 09:22:32, MSFTPSVC1, SERVERNAME, nnn.nn.nn.11, 0, 0, 0, 331, 0, [191747]USER, userxxx, -,
        nnn.nnn.nnn.70, userxxx, 2012-06-11, 09:22:32, MSFTPSVC1, SERVERNAME, nnn.nn.nn.11, 0, 0, 0, 230, 0, [191747]PASS, -, -,

    However, if the login fails I see the following events:

        nnn.nnn.nnn.70, userxxx, 2012-06-11, 09:16:33, MSFTPSVC1, SERVERNAME, nnn.nn.nn.11, 0, 0, 0, 331, 0, [191739]USER, userxxx, -,
        nnn.nnn.nnn.70, -, 2012-06-11, 09:16:33, MSFTPSVC1, SERVERNAME, nnn.nn.nn.11, 0, 0, 0, 530, 1326, [191739]PASS, -, -,

    In the PASS event of the successful login, the server knows that it is in fact "userxxx" the password belongs to, but when the login fails the user in the PASS event is just "-". Does anyone have any ideas? Any help would be appreciated. :) //JaggenSWE

  • Keeping files private on the internet (.htaccess password or software/PHP/WordPress password)

    - by jiewmeng
    I was asked a while ago to set up a server such that only authenticated users can access the files. It was a test server of sorts, for clients to view WIP sites. More recently, I want to do something similar for some of my own files. Though they are not very confidential, I wish to be the only one viewing them. I thought of doing the same as before:

    Create a robots.txt:

        User-agent: *
        Disallow: /

    Set up some password protection. .htpasswd seems like a very ugly way to do it, and it prompts me even when I log in over FTP. I wonder: would a software method like password-protected posts in WordPress do the trick of locking out the public and hiding content from search engines? Or would some self-made PHP script do the trick?
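
    For the "self-made script" route, here is a minimal sketch of an HTTP Basic Auth gate, written in Python's standard library rather than PHP just to show the idea (the credentials and port are placeholders). Anything behind such a gate is also hidden from search engine crawlers, which only see the 401 response:

        import base64, http.server

        USERNAME, PASSWORD = "me", "secret"   # placeholders

        class AuthHandler(http.server.SimpleHTTPRequestHandler):
            def do_GET(self):
                expected = "Basic " + base64.b64encode(
                    f"{USERNAME}:{PASSWORD}".encode()).decode()
                if self.headers.get("Authorization") != expected:
                    # Ask the browser for credentials and serve nothing.
                    self.send_response(401)
                    self.send_header("WWW-Authenticate", 'Basic realm="Private"')
                    self.end_headers()
                    return
                super().do_GET()  # authenticated: serve files from the current directory

        http.server.HTTPServer(("", 8000), AuthHandler).serve_forever()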

  • Ubuntu 12.04 graphics crashing when gVim opens TeX files

    - by Pdp Molniya
    I am having a little problem every time I open gVim to edit *.tex files: the menus die, windows jiggle (maximize and minimize quickly), and I get an 'internal error' crash report from Ubuntu 12.04. It says the problem is at /usr/lib/unity/unity-panel-service. Any tips on how to solve this? It might be related to the LaTeX package for Vim. I also get these messages when I open gVim (with or without TeX files) from a terminal:

        (gvim:5915): Gnome-WARNING **: Accessibility: failed to find module 'libgail-gnome' which is needed to make this application accessible
        (gvim:5915): GLib-GObject-WARNING **: cannot retrieve class for invalid (unclassed) type `'

    The issue is independent of the theme, I just checked... Thanks a lot for your help! Cheers, Pedro

  • Recovering user files with a Live CD

    - by user33617
    For some reason my machine isn't booting: I get an error akin to "Operating System Not Found". I tried Boot Repair, and that didn't work. So then I decided I would just save my personal files, wipe everything, and reinstall. Except when I go to the /home directory, my username's folder isn't there; instead I end up in the Live CD's own desktop and file folders. Is there some other error occurring? Is there a way to recover the files?

  • How to shrink Windows partition with unmovable files in dual boot installation

    - by Tim
    To install Ubuntu alongside Windows 7, I have to shrink the Windows 7 partition C:. But due to some unmovable files, I cannot shrink it as much as I planned using Windows' own shrinking tool. I guess many of you who have both OSes on the same hard drive have had a similar experience. How do you solve this problem? Any reference that can help is also appreciated! Thanks and regards!

    UPDATE: I have identified the unmovable file that currently stops further shrinking:

        \ProgramData\Microsoft\Search\Data\Applications\Windows\Projects\SystemIndex\Indexer\CiFiles\00010015.wid::$DATA

    If I understand correctly, the file belongs to Windows Search. Can I change something in the Windows system settings to temporarily eliminate this file and similar ones? There are many similar files under the same directory which I guess will also stand in the way of shrinking and cannot be moved by defrag.

  • core.* files eating up server space (~50MB)

    - by skytreader
    I'm renting server space from someone and, upon logging in to my control panel after quite some time, noticed an abnormal spike (~50 MB) in disk usage. Upon investigating, I found a lot of core.* files scattered around my public_html directory. Each one is more than 5 MB in size but no more than 6 MB. The * part is all numbers (as a regex, core\.\d+). I downloaded one and checked the contents. There were a lot of balderdash characters (NUL mostly, but also a scattering of ETB, ETX, STX), but there is this block of readable text which says:

        This text is part of the internal format of your mail folder, and is not a real message. It is created automatically by the mail system software. If deleted, important folder data will be lost, and it will be re-created with the data reset to initial values.

    Pretty self-explanatory. A few blocks above that text are some more readable messages that look like logs, sandwiched between non-printable characters. I've extracted some below:

        Scan not valid for mh mailboxes
        Bogus character 0x%x in news state
        Can't rewrite news state %.80s
        Error closing backup news state %.80s
        No state for newsgroup %.80s found

    Now, a few concerns. Am I under attack? The messages seem to be about my webmail, but I don't use my personal webmail that much: only for a vanity email address and an inbox for an outdated comments system. However, lately I seem to notice a spike in the spam for my vanity mail. (Note: the comments system is covered by a captcha, but every now and then some spam gets through. My vanity email has a spam filter, but it isn't as good as I'd like.) Next, if this is a feature, can I turn it off? Is it advisable to? I have only 150 MB, so you see why I'm fretting over a 50 MB spike. Some final details: my only server-side scripts are in PHP. The directory that accumulated the most of these core files is the one containing the WordPress-managed subdomain of my site. I manage my server through cPanel. Lastly, I decided to delete these files, and after some checking nothing seems amiss on my websites or in my mail. They were indeed the ones responsible for the ~50 MB spike, as my disk usage is back to the expected level.

  • Showing renames in hg log?

    - by Ryan Thompson
    I know that Mercurial can track renames of files, but how do I get it to show me renames instead of adds/removes when I do hg log? For instance, instead of:

        A bin/extract-csv-column.pl
        A bin/find-mirna-binding.pl
        A bin/xls2csv-separate-sheets.pl
        A lib/Text/CSV/Euclid.pm
        R src/extract-csv-column.pl
        R src/find-mirna-binding.pl
        R src/modules/Text/CSV/Euclid.pm
        R src/xls2csv-separate-sheets.pl

    I want some indication that four files have been moved. I think I read somewhere that the output is like this to preserve backward compatibility with something-or-other, but I'm not worried about that.
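
    One thing worth trying (hedged, from memory; check hg help status): per changeset, hg status with the --copies flag reports the rename source indented under the added file instead of a bare add/remove pair:

        hg status -C --change <rev>

        A bin/extract-csv-column.pl
          src/extract-csv-column.pl
        R src/extract-csv-column.pl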

  • Delete System Files containing string

    - by Fuzz Evans
    I am trying to write a batch file that will examine a given directory, read each file looking for a given string "Example", and then delete any files that contain the string. The files are system files, so I don't know what the exact extension is or whether that matters (maybe you can just omit the file-type filter and have it read all files?). Some of the files will also be locked from reading, so it needs to handle access-denied errors if they occur; I'm not sure how batch files handle that.
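
    A sketch of the same logic in Python rather than batch, since handling access-denied errors is much easier there (the directory and search string are placeholders):

        import os

        TARGET_DIR = r"C:\some\directory"   # placeholder
        NEEDLE = b"Example"

        for name in os.listdir(TARGET_DIR):
            path = os.path.join(TARGET_DIR, name)
            if not os.path.isfile(path):
                continue   # no extension filter: read every regular file
            try:
                with open(path, "rb") as f:
                    contents = f.read()
            except OSError:
                # Covers access-denied and sharing-violation errors on locked files.
                print("skipped (locked):", path)
                continue
            if NEEDLE in contents:
                try:
                    os.remove(path)
                    print("deleted:", path)
                except OSError:
                    print("could not delete (locked):", path)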

  • Ubuntu One pretends to synchronize files, but it doesn't

    - by Tom Brito
    I have my Ubuntu One account configured on both Ubuntu 11.10 and iOS (iPod touch). The photos from the iOS device were successfully uploaded, but on Ubuntu, although the "syncing" and "synchronized" marks show over the icons, the files do not appear on the website (one.ubuntu.com). In short: my files are not showing on the Ubuntu One website, although the icons have the "uploaded" mark. Any idea what could be wrong here? Obs. 1: Also, not sure if it's related, but the icon marks show up only when I open the Ubuntu One Control Panel. It shows the message "file was uploaded", but there's nothing online. Obs. 2: The folder I'm trying to synchronize is 30 MB in size, and my connection is 8 Mbps.

  • Setting up SVN (Subversion) to manage our company's files, and how to exclude large files from being versioned

    - by Roeland
    Two other guys and I recently started our own web development company. We each work from home, and we have decided we want to keep one central location for all of our files. These files include Word documents, spreadsheets, client files, designs, etc.; anything pertaining to our company. I have a pretty solid internet connection and a Windows 2008 server box sitting at home, so I set up a Subversion repository. Our file repository will look something like this:

        Clients
            Company A
                Design (photoshop files, wireframes, concepts)
                Documents (logins, quotes, proposals, etc.)
                Site Backups
            Company B
                Design
                Documents
                Site Backups
        Prospects
            Company C
            Company D
        Our Company
            Our Website
            Documents (contracts, operating procedures)

    My question is in regard to design files. The Photoshop files my designer works with range in size from 10 MB to 100 MB. I don't think we need to keep these files versioned, as that would eat up space incredibly fast. How do I go about controlling which files get versioned and which files are just stored? What I am thinking is that all documents should be versioned and any files other than that should not be. Any help would be appreciated, thanks!

    Edit: I am also curious whether this is the way to go at all. I like this system since it keeps versions of all my documents, and essentially I will have 3 copies in 3 different locations, so no separate backups are needed. I am unsure how svn would perform as purely a huge file repository.
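
    If the goal is simply to keep the large Photoshop files from being added in the first place, Subversion's svn:ignore property on the design directories does that; a sketch (the pattern and path are examples, and note that a file already under version control has to be removed with svn rm before the ignore takes effect):

        svn propset svn:ignore "*.psd" "Clients/Company A/Design"
        svn commit -m "Stop versioning Photoshop files in the design folder"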

  • How to add reflection definition to read JSON files in web game

    - by user3728735
    I have a game which I deployed for desktop and Android. I can read JSON data and create my levels, but when it comes to reading JSON files from the web app, I get an error that logs "cannot read the json file". I researched a lot and found out that I should add my JSON config class to the reflection configuration, so I added this line to gameName.gwt.xml, which is in the core folder:

        <extend-configuration-property name="gdx.reflect.include" value="com.las.get.level.LevelConfig"/>

    But it did not work. I have no idea where I should place this line or what I should change to make my web app able to read JSON files.
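
    For what it's worth, a hedged pointer: in libGDX GWT builds the reflection include normally belongs in the GWT module definition of the html project (often named GdxDefinition.gwt.xml), inside the <module> element, rather than in the core folder:

        <module>
            <!-- existing inherits / entry-point configuration ... -->
            <extend-configuration-property name="gdx.reflect.include" value="com.las.get.level.LevelConfig" />
        </module>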

  • Renaming hundreds of files at once for proper sorting

    - by Mew
    I have a ton of files named 1.jpg, 2.jpg, 3.jpg, and so on up to 1439.jpg. However, one of my projects has a problem with alphabetizing: it sorts them in the order 1.jpg, 10.jpg, 11.jpg, and so on. What I need is some way (or a script) to rename the files into a zero-padded format such as 00001.jpg, all the way up to 01439.jpg. How would I be able to do this quickly and efficiently?
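
    A short Python sketch that does the zero-padding rename in one pass (the directory is a placeholder):

        import os

        PHOTO_DIR = "/path/to/photos"   # placeholder

        for name in os.listdir(PHOTO_DIR):
            stem, ext = os.path.splitext(name)
            if ext.lower() == ".jpg" and stem.isdigit():
                # 1.jpg -> 00001.jpg, 1439.jpg -> 01439.jpg
                os.rename(os.path.join(PHOTO_DIR, name),
                          os.path.join(PHOTO_DIR, stem.zfill(5) + ext))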

  • How to copy files via terminal?

    - by Levan
    This might sound silly to some people, but I'm new to Linux and don't know how to use it as well as others. Yes, I have read about copying files with the terminal, but these examples will help me a lot. Here is what I want to do:

    - I have a file untitelds.mpg in /home/levan/kdenlive and I want to copy it to /media/sda3/SkyDrive without deleting anything in the SkyDrive directory.
    - I have a file untitelds.mpg in /media/sda3/SkyDrive and I want to copy it to /home/levan/kdenlive without deleting anything in the kdenlive directory.
    - I want to copy a folder from my home directory to sda3 (and the other way around) without deleting anything in the destination directory.
    - I want to cut a folder/file and move it to another place without deleting the files already in the directory I move it into.
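
    For reference, the matching commands look like this (cp copies and mv moves; neither deletes other files already in the destination, though a file with the identical name would be overwritten, and adding -i makes cp ask before overwriting). The folder name somefolder is a placeholder:

        cp /home/levan/kdenlive/untitelds.mpg /media/sda3/SkyDrive/
        cp /media/sda3/SkyDrive/untitelds.mpg /home/levan/kdenlive/
        cp -r /home/levan/somefolder /media/sda3/
        mv /home/levan/somefolder /media/sda3/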

  • Podcast site - Serve audio files with CDN

    - by Bobe
    I am managing a small podcast website hosted on a shared server. Currently there are only eight or nine episodes, each of which is about 50 MB, so bandwidth is not really an issue at the moment. However, looking forward, would it be feasible to use a "free" CDN like Cloudflare to serve the audio files? If so, how would I set this up? I took a quick look at it before, and it seems you have to have your whole site routed (is that the right term?) through the CDN rather than just specific files or file types. I'd like some clarification on this.

  • Working With Files in Dreamweaver

    Your web files have special needs because any time you move or delete a file, it affects the other files that link to it. So Dreamweaver also performs file management tasks suited to the web development environment. Dreamweaver keeps a record of all the links on your site. (Remember that the code for a photo is also a link.) Then, when you move a file, Dreamweaver will ask whether you want to update the other pages that link to it. If you delete a file, Dreamweaver will warn you if other pages are using it. Dreamweaver is actually a package of programs, and one of them handles these file management tasks.

  • Ubuntu automatic logout whenever I execute exe files

    - by KeepTrying
    I have a problem. Here's the thing: there were 4 partitions on my hard drive:

    - one for the Ubuntu root folder
    - one for the Ubuntu home folder
    - one for general stuff like music, movies...
    - and the last one for swap

    To install Windows 7, I resized the partitions and changed their order using GParted. I moved all of the ext-formatted partitions to the left, so the spare space would be at the right, and I formatted that spare space as NTFS and installed Windows 7. After successfully installing Windows 7, I used a live USB to fix GRUB: I installed Boot Repair and, with just one click, now I can dual-boot Ubuntu and Windows 7. But here is the point: because of changing the order of the partitions, especially the partition containing the home folder, I couldn't log in to Ubuntu. I used recovery mode and changed the file /etc/passwd. Everything almost got back to normal, except one thing: the Windows apps that I installed via Wine don't work anymore. I run them via the menu Applications/Wine/Programs, but nothing loads. One more thing: when I double-click on exe files to run them, Ubuntu suddenly logs out. Thank you for reading my post; it's quite long and my English is fairly poor. I'd appreciate anyone who reads it.
