Search Results

Search found 20029 results on 802 pages for 'directory permissions'.


  • Where to store: User connection information?

    - by TomTom
    I am writing a .NET application where the user connects to a given server. All information within the application is stored on the server, but I want/need to store the following for the user:

    - The server he connected to last
    - The username he used to connect last (and no, no password, never ever)

    Any idea where this is best stored? The application config file is not sensible (user != admin, so application.config is write-protected for him). My options are:

    - In the registry, as two keys under my own subkey.
    - In a sort of INI file stored in the user's data directory (AppData). This would also allow later expansion (saving more information, some of which may not fit into the registry).

    Any tips? Other alternatives? So far I tend toward the AppData directory with my own subfolder, simply because it is nice preparation for later keeping things like a local copy of the configuration there.
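
    A minimal sketch of the AppData approach the asker leans toward (the vendor/app folder names and the file format are illustrative, not from the question):

    ```csharp
    using System;
    using System.IO;

    static class ConnectionSettings
    {
        // Resolves to %AppData%\MyVendor\MyApp\connection.txt (hypothetical names)
        static readonly string Dir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyVendor", "MyApp");
        static readonly string FilePath = Path.Combine(Dir, "connection.txt");

        public static void Save(string server, string userName)
        {
            Directory.CreateDirectory(Dir); // no-op if it already exists
            File.WriteAllLines(FilePath, new[] { server, userName });
        }

        public static string[] Load()
        {
            // Returns { server, userName }, or null on first run
            return File.Exists(FilePath) ? File.ReadAllLines(FilePath) : null;
        }
    }
    ```

    Environment.SpecialFolder.ApplicationData resolves to the per-user roaming profile, so writing there needs no admin rights, which sidesteps the application.config problem.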


  • IIS + PHP + Page with lots of images = Intermittent 403 errors

    - by samJL
    I am using an up-to-date Server 2008 R2 Datacenter, running IIS 7.5 and PHP 5.3.6/FastCGI.

    On PHP pages with lots of images (60+), some of the images fail to load. It is not always the same images: on each page refresh, an image that worked previously may not load, while an image that did not now does. Looking at the Net tab in Firebug reveals that the failing image requests are 403 errors. All of the images are located on the server in question, and the images directory has the correct permissions.

    I believe this problem is the result of a limit on requests. All of my attempts at researching it point to the maxConnections setting in IIS, yet mine is set at the highest/default of 4294967295 (maxBandwidth too). I am also running a ColdFusion site on the same IIS installation, and it does not suffer from 403s on pages with lots of images. I am left thinking that there is another connection limit (in PHP or FastCGI?) overriding the IIS connection limit, but I don't see anything that looks like a request limit in php.ini. What am I missing? Any help would be appreciated, thank you.


  • Crontab + .sh + php

    - by Kristaps Karlsons
    Hi. I'm trying to call a shell script every 5 minutes, which executes a PHP file as root.

    ```
    # crontab -l
    */5 * * * * /home/regularuser/call.sh
    ```

    Permissions:

    ```
    -rw-rw-rw- 1 root root 162 Jun 6 23:40 call.php
    -rwxr-xr-x 1 root root  66 Jun 7 01:20 call.sh
    ```

    call.sh contents:

    ```bash
    #!/bin/bash
    php -q /home/regularuser/call.php
    echo "request processed"
    ```

    My problem is that my PHP file doesn't get executed via crontab. However, if I call call.sh directly, everything works perfectly. I'm new to crontab and shell scripting, so any advice/resources are welcome.
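
    One difference between an interactive shell and cron worth ruling out: cron runs jobs with a minimal environment, so a bare `php` may not be on its PATH even though it is on yours. A hedged revision of call.sh (the /usr/bin/php location is an assumption; confirm it with `which php`):

    ```bash
    #!/bin/bash
    # Use the interpreter's absolute path; cron's PATH is typically just
    # /usr/bin:/bin, so commands that resolve interactively can fail here.
    /usr/bin/php -q /home/regularuser/call.php
    echo "request processed"
    ```

    If the path isn't the issue, appending `>> /tmp/call.log 2>&1` to the crontab line will at least capture whatever error cron sees.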


  • Why does IIS refuse to serve ASP.NET content?

    - by Michael Haren
    My Windows Server 2003 Std server refuses to serve ASP.NET content. It serves regular HTML just fine, but anything .NET, even a one-line HTML file with an .aspx extension, fails silently. Things I've tried:

    - Nothing appears in the event log or IIS WWW logs when it fails; Fiddler shows no response.
    - I reinstalled .NET with:

      C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -U
      C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -I

    - I gave obscenely high permissions on everything I can think of (full control, read, write, etc.) to all possibly relevant users (IUSR_*, ASP.NET, etc.).
    - I confirmed that the ASP.NET v1 and v2 Web Service Extensions are "Allowed" in IIS.
    - I confirmed that Server Manager has the IIS and ASP.NET roles enabled.

    Again, this is the scenario:

    http://localhost/Test/Default.htm <-- Works great!
    http://localhost/Test/Default.aspx <-- Bombs silently with no message at all

    Any guidance will be much appreciated!

    Solution: I reinstalled per the instructions below and it works now. Thanks all!


  • File mkdirs() method not working in Android/Java

    - by Leif Andersen
    I've been pulling out my hair on this for a while now. The following method is supposed to download a file and save it to the specified location on disk:

    ```java
    private static void saveImage(Context context, boolean backgroundUpdate, URL url, File file) {
        if (!Tools.checkNetworkState(context, backgroundUpdate))
            return;

        // Get the image
        try {
            // Make the file
            file.getParentFile().mkdirs();

            // Set up the connection
            URLConnection uCon = url.openConnection();
            InputStream is = uCon.getInputStream();
            BufferedInputStream bis = new BufferedInputStream(is);

            // Download the data
            ByteArrayBuffer baf = new ByteArrayBuffer(50);
            int current = 0;
            while ((current = bis.read()) != -1) {
                baf.append((byte) current);
            }

            // Write the bits to the file
            OutputStream os = new FileOutputStream(file);
            os.write(baf.toByteArray());
            os.close();
        } catch (Exception e) {
            // Any exception is probably a network failure, bail
            return;
        }
    }
    ```

    If the directory for the file doesn't exist, the method is supposed to create it (and if another file is already in that spot, it should just do nothing). However, for some reason the mkdirs() method never makes the directory. I've tried everything from explicit parentheses to explicitly constructing the parent File, and nothing seems to work. I'm fairly certain the drive is writable, as the method is only called after that has been determined, and that check is true when stepping through while debugging. So the method fails because the parent directories aren't made.

    Can anyone tell me if there is anything wrong with the way I'm calling it? If it helps, here is the source for the file I'm calling it from: https://github.com/LeifAndersen/NetCatch/blob/master/src/net/leifandersen/mobile/android/netcatch/services/RSSService.java

    Thank you
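
    Since mkdirs() reports failure only through its boolean return value, and the catch block above swallows every exception, a failure here is easy to miss. A diagnostic sketch (the log tag is illustrative, and the permission named in the comment is a guess at a common cause, not a confirmed diagnosis):

    ```java
    import android.util.Log;
    import java.io.File;

    final class DirCheck {
        private static final String TAG = "RSSService"; // illustrative tag

        /** Create file's parent directories, logging the state on failure. */
        static boolean ensureParentDirs(File file) {
            File parent = file.getParentFile();
            if (parent == null || parent.exists()) return true;
            if (parent.mkdirs()) return true;
            // On Android, a missing WRITE_EXTERNAL_STORAGE entry in the
            // manifest makes mkdirs() fail silently in exactly this way.
            Log.e(TAG, "mkdirs failed for " + parent
                    + ", exists=" + parent.exists()
                    + ", canWrite=" + parent.canWrite());
            return false;
        }
    }
    ```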


  • GPG error occurs while using "deb file:/local-path-to-repo ..." in /etc/apt/sources.list

    - by Chandler.Huang
    I need to install packages in an environment with no Internet connection. My plan is to download the dists structure from the Internet and then add the file path to /etc/apt/sources.list. So I downloaded the related structure, including ubuntu/dists/precise, precise-backports, precise-proposed, precise-security, and precise-updates, from an FTP mirror server. Then I removed the original sources and added the following to my /etc/apt/sources.list:

    ```
    deb file:path-to-local-ubuntu-directory/ precise main restricted multiverse universe
    deb-src file:path-to-local-ubuntu-directory/ precise main restricted multiverse universe
    ```

    After apt-get update I get this GPG error:

    ```
    root@openstack:/~# apt-get update
    Ign file: precise InRelease
    Get:1 file: precise Release.gpg [198 B]
    Get:2 file: precise Release [50.1 kB]
    Ign file: precise Release
    Get:3 file: precise/main TranslationIndex [3,761 B]
    Get:4 file: precise/multiverse TranslationIndex [2,716 B]
    Get:5 file: precise/restricted TranslationIndex [2,636 B]
    Get:6 file: precise/universe TranslationIndex [2,965 B]
    Reading package lists... Done
    W: GPG error: file: precise Release: The following signatures were invalid: BADSIG 0976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]>
    ```

    I had tried the following steps after googling, but in vain:

    ```
    sudo apt-get clean
    cd /var/lib/apt
    sudo mv lists lists.old
    sudo mkdir -p lists/partial
    sudo apt-get update
    ```

    Is there any way to resolve this? And why does this error occur? Thanks a lot.
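
    A BADSIG on a local mirror usually means the copied Release and Release.gpg no longer match, a common outcome when a mirror is fetched file-by-file while the upstream is being updated. Two hedged things to try: re-download Release and Release.gpg together from the same snapshot, or, on apt versions that support the option, mark the local source as trusted so the signature check is skipped (verify your apt accepts this syntax before relying on it):

    ```
    deb [trusted=yes] file:path-to-local-ubuntu-directory/ precise main restricted multiverse universe
    ```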


  • Mounting share over VPN

    - by user1337
    I have a CentOS 5 web server which currently mounts an NFS export from my Mac OS X 10.7 laptop. It works great, except that over VPN I can't get it to mount at all.

    I tried SMBUp but haven't been able to get it working even locally, and there doesn't look to be an easy way to install netatalk for CentOS 5; even then, I'm not sure that's the best approach. I also tried a GUI SSH client that can "mount a FTP disk", and it would work, except the files require root access, there's no external root access, and the client can't elevate permissions.

    The basic thing I need is for the server to be able to read the files off of my laptop, connected via VPN. The files are frequently updated (every 5-20 seconds), so I don't want to transfer them manually via SSH. Which protocol can work with both platforms and easily handle the latency introduced by VPN (and potentially mobile broadband)? Thanks


  • Git add not working with .png files?

    - by D Lawson
    I have a dirty working tree: dirty because I made changes to source files and touched up some images. I was trying to add just the images to the index, so I ran this command:

    ```
    git add *.png
    ```

    But this doesn't add the files. A few new image files were added, but none of the modified/pre-existing ones were. What gives?

    Edit: Here is some relevant terminal output:

    ```
    $ git status
    # On branch master
    #
    # Changed but not updated:
    #   (use "git add <file>..." to update what will be committed)
    #   (use "git checkout -- <file>..." to discard changes in working directory)
    #
    #       modified:   src/main/java/net/plugins/analysis/FormMatcher.java
    #       modified:   src/main/resources/icons/doctor_edit_male.png
    #       modified:   src/main/resources/icons/doctor_female.png
    #
    # Untracked files:
    #   (use "git add <file>..." to include in what will be committed)
    #
    #       src/main/resources/icons/arrow_up.png
    #       src/main/resources/icons/bullet_arrow_down.png
    #       src/main/resources/icons/bullet_arrow_up.png
    no changes added to commit (use "git add" and/or "git commit -a")
    ```

    Then I executed "git add *.png" (no output after the command), and then:

    ```
    $ git status
    # On branch master
    #
    # Changes to be committed:
    #   (use "git reset HEAD <file>..." to unstage)
    #
    #       new file:   src/main/resources/icons/arrow_up.png
    #       new file:   src/main/resources/icons/bullet_arrow_down.png
    #       new file:   src/main/resources/icons/bullet_arrow_up.png
    #
    # Changed but not updated:
    #   (use "git add <file>..." to update what will be committed)
    #   (use "git checkout -- <file>..." to discard changes in working directory)
    #
    #       modified:   src/main/java/net/plugins/analysis/FormMatcher.java
    #       modified:   src/main/resources/icons/doctor_edit_female.png
    #       modified:   src/main/resources/icons/doctor_edit_male.png
    ```
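
    One possibility worth checking (a guess, not a confirmed diagnosis): an unquoted *.png is expanded by the shell before git ever sees it, and the shell only matches files in the current directory. Quoting the pattern hands it to git unexpanded, and git's own pathspec matching descends into subdirectories and applies to modified tracked files as well:

    ```
    git add '*.png'
    ```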


  • Why wouldn't an embedded Silverlight control work in a page?

    - by rsteckly
    Hi, I have a Silverlight application project in my solution. The other project is a web application that has a .xap file in ClientBin. When I created the Silverlight project, it asked if I wanted the ASP.NET application to host it (and I said yes).

    In the root directory there is a test page for the Silverlight control, and that page loads the control. In another directory, I inserted the SAME markup to launch the Silverlight control again, and nothing happens. Why would the control launch on one page and not on the other? Can anyone point me to documentation about dependencies I might not know about? I've put a reference to Silverlight.js on the page as well. Here's the markup:

    ```html
    <div id="silverlightControlHost">
        <object data="data:application/x-silverlight-2," type="application/x-silverlight-2" width="100%" height="100%">
            <param name="source" value="../ClientBin/Editor.xap"/>
            <param name="onError" value="onSilverlightError" />
            <param name="background" value="white" />
            <param name="minRuntimeVersion" value="3.0.40818.0" />
            <param name="autoUpgrade" value="true" />
            <a href="http://go.microsoft.com/fwlink/?LinkID=149156&v=3.0.40818.0" style="text-decoration:none">
                <img src="http://go.microsoft.com/fwlink/?LinkId=108181" alt="Get Microsoft Silverlight" style="border-style:none"/>
            </a>
        </object>
        <iframe id="_sl_historyFrame" style="visibility:hidden;height:0px;width:0px;border:0px"></iframe>
    </div>
    </div>
    ```
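
    One likely culprit (unverified): the source param is resolved relative to the page hosting the control, so "../ClientBin/Editor.xap" points one level above the page's own directory, which is only correct for pages in the web root. A root-relative path resolves identically from any directory:

    ```html
    <!-- Sketch: root-relative, so pages in subdirectories find the same .xap -->
    <param name="source" value="/ClientBin/Editor.xap"/>
    ```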


  • ASP.NET image upload "Parameter is not valid" exception

    - by pennylane
    Hi guys, I'm just trying to save a file to disk using a posted stream from jQuery Uploadify, and I'm getting "Parameter is not valid". After adding to the error message so I can tell where it blew up in production, I see it fail on:

    ```csharp
    var postedBitmap = new Bitmap(postedFileStream)
    ```

    Any help would be most appreciated.

    ```csharp
    public string SaveImageFile(Stream postedFileStream, string fileDirectory, string fileName, int imageWidth, int imageHeight)
    {
        string result = "";
        string fullFilePath = Path.Combine(fileDirectory, fileName);
        string exhelp = "";
        if (!File.Exists(fullFilePath))
        {
            try
            {
                using (var postedBitmap = new Bitmap(postedFileStream))
                {
                    exhelp += "got past bmp creation" + fullFilePath;
                    using (var imageToSave = ImageHandler.ResizeImage(postedBitmap, imageWidth, imageHeight))
                    {
                        exhelp += "got past resize";
                        if (!Directory.Exists(fileDirectory))
                        {
                            Directory.CreateDirectory(fileDirectory);
                        }
                        result = "Success";
                        postedBitmap.Dispose();
                        imageToSave.Save(fullFilePath, GetImageFormatForFile(fileName));
                    }
                    exhelp += "got past save";
                }
            }
            catch (Exception ex)
            {
                result = "Save Image File Failed " + ex.Message + ex.StackTrace;
                Global.SendExceptionEmail("Save Image File Failed " + exhelp, ex);
            }
        }
        return result;
    }
    ```
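
    "Parameter is not valid" from the Bitmap(Stream) constructor generally means GDI+ could not decode the stream, most often because the upload arrived truncated or the stream's position is not at the start. A hedged guard to try before constructing the bitmap (assumes .NET 4's Stream.CopyTo is available):

    ```csharp
    // Buffer the upload into a seekable stream and rewind it, so the
    // decoder sees the image bytes from the beginning.
    var buffered = new MemoryStream();
    postedFileStream.CopyTo(buffered);
    buffered.Position = 0;
    using (var postedBitmap = new Bitmap(buffered))
    {
        // ... resize and save as before
    }
    ```

    If the failure persists, logging buffered.Length against the client-side file size would show whether the stream is arriving incomplete.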


  • Dealing with the update location for ClickOnce

    - by Assimilater
    I'm not sure how many people here are experts with Visual Studio, but I'd imagine a handful (not to raise expectations but to appeal to your egos :P). I'm working primarily in Visual Basic for now (though I hope to switch to C# in the near future, and maybe a Java or web app).

    Basically I'm trying to create an update feature that will work similarly to how common programs such as Firefox or iTunes update automatically. There is supposed to be functionality for this in what is called ClickOnce. I carry out the following procedure and get the following error when trying to change the update URL of my program to a password-protected FTP location:

    1. Go to project properties
    2. Go to Publish
    3. Click Updates
    4. Click Browse
    5. Click FTP Site
    6. Under Server put: web###.opentransfer.com
    7. Under Port: 21
    8. Under Directory put: CMSOFT
    9. Passive mode is selected (which is what FileZilla tells me the server is accessed with)
    10. Anonymous User is unselected and a username and password are typed in
    11. Push OK

    Under Update location it shows: ftp://web###.opentransfer.com/CMSOFT. I push OK and see a message box titled "Microsoft Visual Basic 2010 Express" with an X icon:

    Publish.UpdateUrl: The string must be a fully qualified URL or UNC path, for example "http://www.microsoft.com/myapplication" or "\\server\myapplication".

    I've tried changing the directory to "CMSOFT/PQCM.exe" and the results are the same. Hope this was descriptive enough.


  • Copying files between servers by creation time

    - by driftux
    My bash scripting knowledge is very weak, so I'm asking for help here. What is the most efficient bash script, performance-wise, to find and copy files from one Linux server to another using the specifications described below?

    I need a bash script that finds only new files, created on server A within the last 0 to 10 minutes, in directories named "Z". It should then transfer them to server B. I think it can be done by building a command for each new file found, like "scp /X/Y.../Z/file root@hostname:/X/Y.../Z/". If the script finds no such remote path on server B, it should continue with the next file whose directory does exist. Files should be copied with their permissions, group, owner, and creation time preserved. X/Y... are various directory paths.

    I want to set up a cron job to execute this script every 10 minutes, so performance is very important in this case. Thank you.
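
    A sketch under those specifications (paths and host are the question's placeholders; assumes key-based SSH auth and GNU find, and note that find's -mmin tests modification time, since most Linux filesystems don't expose a creation time):

    ```bash
    #!/bin/bash
    # Find files changed in the last 10 minutes inside any directory named
    # "Z" under /X, and copy each one to the same path on server B.
    find /X -type d -name Z -print0 |
    while IFS= read -r -d '' dir; do
        find "$dir" -maxdepth 1 -type f -mmin -10 -print0 |
        while IFS= read -r -d '' f; do
            # scp -p preserves mode and times but not owner/group;
            # rsync -a would match the stated requirements more closely.
            scp -p "$f" "root@hostname:$dir/" ||
                echo "skipped $f (remote directory missing?)" >&2
        done
    done
    ```

    Since this runs from cron every 10 minutes, a file written right at the boundary can be caught twice or missed; keeping a timestamp file and using find -newer against it is the usual fix.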


  • How do I set up two existing disks with identical contents as a single mirrored volume in Windows 7 without losing data?

    - by Software Monkey
    I have two data disks that were, heretofore, in a mobo RAID configuration in Windows 7. They are now separate AHCI disks, visible in Computer Management. How do I go about making them a single mirrored volume in Windows?

    Note: The data is backed up on two other separate disks, but it's a fair amount of work to do a restore (over 120,000 files, and I have to reset permissions).

    Note 2: Currently the two disks are identical, and I can use the content of either one for this.


  • .NET Application with SQL Server CE Database

    - by blu
    I just started using SQL Server CE 3.5 in my WinForms application (C# in VS 2008 SP1). I've noticed a couple of interesting things I'd like some input on:

    1. Copying of the sdf file to bin

    My sdf file is located inside an Infrastructure project that houses my repository implementations. When the application is first debugged, the sdf is copied to bin\Debug, and that copy is where all future reads/writes operate. At some point when this is deployed, the file will go into a data folder using ClickOnce, but during development, where should I be putting this sdf? Is having it in bin typical, or are there other recommendations?

    2. Updating the sdf

    It appears that writing to the sdf file does not immediately update the database. I am using LINQ to SQL and am calling SubmitChanges, but on read the values are not returned. However, if I close the application and re-open it, the added value is there. Is there an additional flush step I need to take? What is causing this: file locking, buffering, something else?

    Update

    3. Unit tests

    I have an MSTest project, and the sdf file is not being copied to the correct output directory. I have the settings:

    Build Action: Content
    Copy to Output Directory: Copy Always

    The message is: System.Data.SqlServerCe.SqlCeException: The database file cannot be found. Check the path to the database.

    I appreciate any guidance on these questions, thanks. If there is a tutorial other than what is on MSDN that you know about, that would be great too. Working with CE is proving to be a difficult task and I welcome any help I can find.
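
    On the unit-test issue specifically: MSTest copies test assemblies to a separate TestResults\...\Out folder at run time, so "Copy to Output Directory" alone isn't enough; the database also has to be declared a deployment item. A sketch (the file name is illustrative):

    ```csharp
    [TestClass]
    public class RepositoryTests
    {
        // Copies the database into the test run's deployment directory.
        [TestMethod]
        [DeploymentItem("MyDatabase.sdf")]
        public void CanReadFromDatabase()
        {
            // ... connect using a relative path to MyDatabase.sdf
        }
    }
    ```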


  • Win Server 2008 R2 - Mapped shared folder hanging?

    - by M-Tech
    I have recently built a Windows Server 2008 R2 machine. It is purely a file server and is very much a basic build: all Windows updates installed and joined to the domain. I have set up a shared folder on the C: drive and added permissions for domain users as co-owners. The client machines run XP SP3 and are part of the domain also.

    We have a few servers with the same setup on a few of our sites, but this one in particular hangs users' machines (explorer.exe hangs for at least a few minutes) when they attempt to access the shared folder. I have also turned off the power-save option on the network card; still no change. Any help with this is very much appreciated and I look forward to hearing from you ;)


  • When opening any file in Excel, a 1 is added to the name, and the default is to save a new copy…

    - by Chris
    OK... I've searched a lot for this, but it's not an easy question to search for! When I open any file (xls or xlsx) in Excel 2007, Excel acts like it's a read-only file, essentially creating a new file with the same name plus a 1 on the end. E.g. I open NewDoc.xlsx; Excel opens it as NewDoc1.xlsx, and the Save button brings up the Save As dialogue in my default folder.

    Does anyone know how to set it back to allowing me to open, edit, and save a document without having to browse to the original document and save over it? My immediate thought was access permissions, but the file is in a network folder with my user given Full Control. I also tried creating a new file in that folder, and also on my local machine just in case: same result. To make it even stranger, if I browse to the original file using the Save As dialogue, it will let me save over the original without any further prompts.


  • Batch script to create home directories from a list of names

    - by Steven
    I'm trying to create home directories, with permissions, from a text file, but I can only get the batch file to process the first line. Can anyone tell me why? I initiate the scripts by running go.bat as administrator.

    go.bat:

    ```
    @echo
    for /f %%a in (users1.txt) do call test.bat %%a
    ```

    test.bat:

    ```
    @echo off
    m:
    cd \
    mkdir %1
    icacls %1 /grant %1:(OI)(CI)M
    cd %1
    mkdir public
    icacls public /inheritance:d
    icacls public / All:(OI)(CI)(RD)
    icacls public /grant All:(OI)(CI)R
    mkdir private
    icacls private /inheritance:d
    icacls private /remove All
    cd \
    ```

    users1.txt:

    ```
    user1
    user2
    user3
    ```
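
    Not a confirmed fix, but one variable worth removing: test.bat switches to drive M: and changes directories without ever switching back, so every iteration after the first starts wherever the previous one finished. A sketch of the same script with its state isolated per call (abridged; the remaining mkdir/icacls steps are unchanged from above):

    ```
    @echo off
    setlocal
    rem pushd changes drive and directory in one step and remembers where
    rem we came from; popd restores it before control returns to go.bat.
    pushd m:\
    mkdir %1
    icacls %1 /grant %1:(OI)(CI)M
    rem ... remaining mkdir/icacls steps as in the original ...
    popd
    endlocal
    ```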


  • Prioritize file sharing performance in Windows Server 2008

    - by cmbrnt
    I've got a server running Windows Server 2008, and I use it mainly for sharing files throughout the domain from a number of disks. It's running on VMware ESXi 4.0, in case that matters.

    My problem is that when I log in to the server to check user permissions etc., access to the files on the remote disks almost grinds to a halt. I haven't been able to measure the speeds, but I would guess it slows down to about 100 kB/s as soon as I log in. This is on a gigabit network, and the problem is the same for all users, even the ones connected to the same switch as the server. I've assigned 2 GB RAM to the server and reserved it 1.5 GHz of processor power. I don't have to do anything special on the server for this halt to occur.

    How can I make sure file sharing is prioritized on the server, so that no matter what applications I'm using, file sharing always works properly? Could this be a VMware issue?


  • Get-Mailbox not returning all mailboxes

    - by rotard
    I am trying to set up an Exchange mailbox backup job with Vembu StoreGrid, and StoreGrid is unable to list the mailboxes for the client. While troubleshooting the issue, I noticed something else: running the Get-Mailbox command on the mail server as the backup user only shows the mailbox for that account, while running Get-Mailbox as my admin account returns a list of what appears to be all the mailboxes.

    My service account is a member of "Administrators", "Domain Admins", and "Domain Users". What additional permissions might be required to list all mailboxes in the system?
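
    In Exchange 2010's RBAC model, seeing other recipients through Get-Mailbox is governed by Exchange role assignments rather than Windows group membership, so domain-admin rights alone don't grant it. A hedged sketch for the Exchange Management Shell (the role-group name and account are assumptions to verify against your Exchange version):

    ```powershell
    # Grant the service account organization-wide read access to recipients.
    Add-RoleGroupMember "View-Only Organization Management" -Member backupsvc

    # Then, running as the service account; the default result set is capped,
    # so ask for everything explicitly.
    Get-Mailbox -ResultSize Unlimited
    ```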


  • Access denied for user 'diduser'@'localhost' to database 'diddata' (1044, 42000)

    - by Arlen Beiler
    I am trying to set up a MySQL server, and when I went to create a second user, it wouldn't give it permissions for the database. I can connect fine as long as I don't specify a database; when I do, I get:

    Access denied for user 'user'@'localhost' to database 'diddata'

    The connection details are:

    ```
    { 'host' : 'localhost', 'user' : 'user', 'password' : 'password', 'database': 'diddata' };
    ```

    And to create the DB and user I did:

    ```sql
    CREATE DATABASE IF NOT EXISTS diddata;
    CREATE USER 'user'@'localhost' IDENTIFIED BY 'password';
    GRANT ALL ON user.* TO 'user'@'localhost';
    ```

    Note that I've changed the username and password in this question. I've already checked the privileges in MySQL Workbench and they are there.
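
    It may be worth a second look at the GRANT itself: as quoted, it grants privileges on user.* (a database named "user"), not on diddata. A sketch of a grant matching the database in the connection details:

    ```sql
    GRANT ALL PRIVILEGES ON diddata.* TO 'user'@'localhost';
    FLUSH PRIVILEGES;
    ```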


  • "Device not ready" on a network share in Windows 7

    - by user60689
    I have two computers, C1 and C2, running Windows 7; both are members of a domain. On C1 I have a USB hard disk which I shared with the users U1 and U2, giving them read-only permissions on the entire drive. However, even though I can see and browse the hard disk locally (in other words, from C1), from the other computer (C2), where I'm logged in as U1, trying to access C1's shared device makes C2's Windows 7 throw an error saying "Device Not Ready".

    Why? How can I fix this?

    PS: I tried un-sharing and re-sharing it. No luck.


  • Is NFS capable of preserving order of operations?

    - by JustJeff
    I have a diskless host A that has a directory NFS-mounted on server B. A process on A writes to two files, F1 and F2, in that directory, and a process on B monitors these files for changes. Assume that B polls for changes faster than A is expected to make them. Process A seeks to the head of the files, writes data, and flushes. Process B seeks to the head of the files and does reads.

    Are there any guarantees about how the order of the changes performed by A will be detected at B? Specifically, if A alternately writes to one file and then the other, is it reasonable to expect that B will notice alternating changes to F1 and F2? Or could B conceivably detect a series of changes on F1 and then a series on F2?

    I know there are a lot of assumptions embedded in the question. For instance, I am virtually certain that, even operating on just one file, if A performs 100 operations on the file, B may see a smaller number of changes that give the same result, due to NFS caching some of the actions on A before they are communicated to B. And of course there would be issues with concurrent file access even if NFS weren't involved and both the reading and the writing process were running on the same real file system.

    The reason I'm even putting the question up here is that it seems like, most of the time, the setup described above does detect the changes at B in the same order they are made at A, but occasionally some events come through in transposed order. So: is it worth trying to make this work? Is there some way to tune NFS to make it work, perhaps cache settings or something? Or is fine-grained behavior like this just too much to expect from NFS?
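
    On the tuning question, the knobs that usually matter are the client-side caching options on A's mount. A hedged example mount line (Linux syntax per nfs(5); verify option support on the client in question):

    ```
    # /etc/fstab on host A: noac disables attribute caching and forces
    # application writes through to the server immediately, at the cost
    # of considerably more NFS traffic.
    serverB:/export  /mnt/shared  nfs  noac  0  0
    ```

    Even with caching off, NFS's stated guarantee is close-to-open consistency, so strict per-write ordering across two files is still more than the protocol promises.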


  • Manually Add the TortoiseSVN Registry information?

    - by Pete Michaud
    I recently installed TortoiseSVN on my Windows 7 64-bit computer. For reasons outside the scope of this question, the installer could not get appropriate permissions to add the keys that TSVN needs in the registry. I'd like to add those keys manually with a .reg file.

    I tried unzipping the .msi installer to see if a .reg file was there, but no luck. I looked around the net a little, but no luck. I looked in the source code, figuring there must be a file somewhere with a list of all the registry changes in one place, but I haven't found any such thing. How can I get a complete list of registry changes for a fresh TortoiseSVN installation?
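
    One practical way to get that list without reading the installer's source (a sketch; the key paths are the usual locations but should be confirmed in regedit on a working machine): install TortoiseSVN somewhere the installer runs with full rights, export its keys, and import them on the problem machine.

    ```
    rem On the machine with a clean install:
    reg export "HKLM\SOFTWARE\TortoiseSVN" tsvn-hklm.reg
    reg export "HKCU\Software\TortoiseSVN" tsvn-hkcu.reg

    rem On the problem machine, from an elevated prompt:
    reg import tsvn-hklm.reg
    reg import tsvn-hkcu.reg
    ```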


  • Unable to log in to CentOS

    - by Rendl
    I set up a multi-node cluster using CentOS with VMware yesterday. Today, when I reboot the nodes, I get the below error on startup:

    "There is a problem with the configuration server. (/usr/libexec/gconf-sanity-check-2 exited with status 256)"

    I am unable to log in as root or any other user, as the screen is frozen. The solutions online say to change the permissions on some tmp files; my problem is that I cannot reach a terminal because I cannot log in. Also, on reboot I do not see any recovery options in CentOS, only the GRUB command line. I am new to Linux and Hadoop. Please help.
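
    A hedged sketch of the usual recovery path on CentOS 5/6 (GRUB legacy): boot into single-user mode to get a root shell without logging in, then apply the permissions fix the online answers point at.

    ```bash
    # At the GRUB menu: press 'a' to append to the kernel line, add the
    # word "single", and press Enter to boot straight to a root shell.

    # gconf-sanity-check-2 failing with status 256 is commonly traced to
    # wrong permissions on /tmp; restoring the sticky world-writable mode
    # is a frequent fix (verify /tmp is actually the culprit first):
    chmod 1777 /tmp
    reboot
    ```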


  • Drupal 7 configuration error with PostgreSQL on Mac OS 10.6.5

    - by Sam
    I am trying to configure Drupal 7 with PostgreSQL. At the database setup step, I get the following error:

    Warning: PDO::__construct(): [2002] No such file or directory (trying to connect via unix:///var/mysql/mysql.sock) in DatabaseConnection->__construct() (line 300 of /Users/shamod/Sites/drupal/7/includes/database/database.inc).

    In order for Drupal to work, and to continue with the installation process, you must resolve all issues reported below. For more help with configuring your database server, see the installation handbook. If you are unsure what any of this means you should probably contact your hosting provider. Failed to connect to your database server. The server reports the following message: SQLSTATE[HY000] [2002] No such file or directory. Is the database server running? Does the database exist, and have you entered the correct database name? Have you entered the correct username and password? Have you entered the correct database hostname?

    NOTE: I am trying to connect to PostgreSQL, but it fails with a /var/mysql/mysql.sock error. I have set up the database connection string in settings.php for PostgreSQL, and it still does not work. Any idea?
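
    The mysql.sock in the message is the tell: PHP's PDO is still using the mysql driver, which happens when the 'driver' key is missing or still set to 'mysql', and the empty host then falls back to a local socket. A hedged sketch of the Drupal 7 settings.php block for PostgreSQL (credentials are illustrative; a TCP host/port avoids the socket path entirely):

    ```php
    <?php
    $databases['default']['default'] = array(
      'driver'   => 'pgsql',      // not 'mysql'
      'database' => 'drupal',
      'username' => 'drupaluser',
      'password' => 'secret',
      'host'     => '127.0.0.1',  // force TCP rather than a unix socket
      'port'     => '5432',
    );
    ```

    It is also worth confirming that PHP's pdo_pgsql extension is loaded, since without it Drupal cannot offer PostgreSQL at the setup step.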

