Search Results

Search found 73216 results on 2929 pages for 'header file'.

  • Cordova - Scrolling with a fixed header and footer (ios)

    - by Samu Singh
    Using Cordova (PhoneGap) and Bootstrap to make a mobile application, testing on iOS for now. I'm getting an issue with a header and footer bar that are fixed, with scrollable content in the middle. When tapping to scroll, the header/footer bar moves down or up with the content, but then snaps back into place as soon as the scrolling completes. If I use -webkit-overflow-scrolling: touch; it works as expected, but it makes it really awkward to scroll through the content, and if you scroll past the end, it only scrolls the header or footer (with elastic overflow) until you stop for a second.

    Here's my HTML for the header/footer bars:

        <div id="headerBar">
          <div class="container-fluid" style="background-color: #1569C7">
            <div class="row">
              <div class="col-xs-3 text-left">
                <button id="logoutButton" type="button" class="btn btn-default">Log Out</button>
                <button type="button" class="btn btn-default" id="restoreQuestionFeedButton">
                  <span class="glyphicon glyphicon-chevron-left"></span>
                </button>
              </div>
              <div class="col-xs-6 text-center" style="height: 55px">
                <strong id="usernameText"></strong>
              </div>
              <div class="col-xs-3 text-right">
                <button id="oldCreatQuestionButton" type="button" class="btn btn-default">
                  <span class="glyphicon glyphicon-plus"></span>
                </button>
              </div>
            </div>
          </div>
        </div>

        <div id="footerBar">
          <div class="container-fluid" style="padding: 0">
            <div class="row text-center">
              <button id="createQuestionButton" type="button" class="btn btn-default footerButton">
                <span class="glyphicon glyphicon-plus"></span>
                <strong>Ask a new free question!</strong>
              </button>
            </div>
          </div>
        </div>

    And here is the related CSS:

        #headerBar {
          position: fixed;
          z-index: 100;
          top: 0;
          left: 0;
          width: 100%;
          background-color: #1569C7;
        }

        #footerBar {
          position: fixed;
          z-index: 100;
          bottom: 0;
          left: 0;
          width: 100%;
          background-color: #1569C7 !important;
        }

  • header() function in PHP: no redirection happens!

    - by jasmine
    I have written a very, very simple script in PHP, but the header redirection is not working. The encoding is UTF-8 without BOM, and even after adding ob_start() the problem continues. What is wrong in my code?

    login.php:

        <?php
        session_start();
        require_once("funcs.php");
        db_connection();
        $username = $_POST['username'];
        $password = $_POST['pwd'];
        $submit = $_POST['login'];
        if($submit){
            if (!filled_out($_POST)) {
                echo "please fill all fields";
            }
            else{
                $query = "SELECT * FROM users WHERE username ='{$username}' AND password ='{$password}'";
                $result = mysql_query($query);
                if(mysql_num_rows($result) == 1){
                    $found_user = mysql_fetch_array($result);
                    $_SESSION['id'] = $found_user['id'];
                    $_SESSION['username'] = $found_user['username'];
                    $_SESSION['password'] = $found_user['password'];
                    setcookie(session_name(), '', time()+86400, '/');
                    header("Location: tst.php");
                }
                else{
                    echo "incorrect username or password";
                }
            }
        }
        ?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
        <title>Untitled Document</title>
        </head>
        <body>
        <form id="form1" name="form1" method="post" action="">
          <p>
            <label for="username">Username:</label>
            <input type="text" name="username" id="username" />
          </p>
          <p>
            <label for="textfield">Password</label>
            <input type="password" name="pwd" id="pwd" />
          </p>
          <p>
            <input name="login" type="submit" id="login" value="Log in" />
          </p>
        </form>
        </body>
        </html>
        <?php db_disconnect();?>

    and tst.php:

        <?php
        session_start();
        require_once("funcs.php");
        if (!isset($_SESSION['id'])){
            header("Location : login.php");
        }
        ?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
        <title>Untitled Document</title>
        </head>
        <body>
        <table id="structure">
          <tr>
            <td id="navigation">&nbsp;</td>
            <td id="page">
              <?php echo "welcome"."". $_SESSION['username']; ?>
            </td>
          </tr>
        </table>
        </body>
        </html>
        <?php db_disconnect();?>

    When opening tst.php directly, header() does not redirect to login.php.
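
    For reference, a minimal sketch of a redirect guard (an illustration, not the original poster's code): the header name must be exactly "Location", so the space in "Location : login.php" above prevents browsers from treating the response as a redirect at all, and an exit after header() stops the rest of the protected page from being sent anyway.

        <?php
        // Minimal redirect sketch. Output buffering keeps header() legal even
        // if something (a BOM, stray whitespace in an include) printed first.
        ob_start();
        session_start();

        if (!isset($_SESSION['id'])) {
            header("Location: login.php"); // no space before the colon
            exit; // stop here so the protected page body is never emitted
        }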

  • File copying utility like rsync with error handling like ddrescue, for data recovery from a hard drive with bad sectors or hardware failure

    - by purefusion
    I have a hard drive with either bad blocks or sectors that are failing to read due to potential mechanical issues, such as a bad disk head, bad motor, or some other issue that is causing the hard drive to read data excruciatingly slowly and with lots of read errors. I'm seeing an average of 50 KB/sec, with some reads dropping below 10 KB/sec, and frequently it gets stuck on a file or sector altogether, usually for quite a long time: 2-10 minutes or more (when using rsync, before it times out). Speed seems to vary wildly, it gets stuck on files a lot, and when it finally gets "unstuck" it only seems to last for a short burst before it gets stuck again. The drive is also very quiet, with only an occasional sound of files copying (usually when it gets stuck/unstuck for a brief time, before getting stuck again). Thus, there are none of those evil sounds that are normally associated with HDD death. Someone suggested that the problems sounded like they might be caused by a misaligned disk head, which requires a lot of re-reads before it finally reads data with success. Sounds plausible, but I digress...

    Anyway, the problem with rsync is that it seems to have no decent error-handling support. Obviously, it wasn't meant for use in recovering data from failing hard drives, but all the so-called "data recovery" utilities out there that are meant for such use usually focus on recovery of deleted files or messed-up partitions, rather than copying files off dying hard drives. Deleted-file recovery is not what I need, obviously, so perhaps you can understand my disappointment in not being able to find what I'm after yet.

    Naturally, this is where you'd probably say "You should use ddrescue!" Well, that's all fine and dandy, but I've already got most of the data backed up, so I just want to recover certain files. I'm not concerned with trying to recover a full partition block-by-block as ddrescue does. I am only interested in rescuing just specific files and directories.

    Ideally, what I'd like is some sort of cross between rsync and ddrescue: something that lets me specify source and destination as directories of normal files like rsync (rather than two full partitions as ddrescue requires), with a way to skip files with errors in an initial run, and then allows me to attempt recovery of those files with errors in a later run (with a slightly altered command, of course), perhaps even offering an option to specify the number of retry attempts; just like how ddrescue works with blocks, only I want a utility that works with specific files/directories like rsync does.

    So am I daydreaming here, or does something out there exist that can do this? Or maybe even a way to make rsync or ddrescue work in such a way? I'm really open to whatever solutions might work, so long as they let me choose which files I want to "rescue", can skip files with errors in the initial run, and try/retry those errors again later.

    So far I've tried rsync with the following options, but it often gets stuck on a file for longer than the timeout, and ideally I'd just like it to move on to the next file and come back later to the files it gets stuck on. I don't think that's possible though. Anyway, here's what I've been using up till now:

        rsync -avP --stats --block-size=512 --timeout=600 /path/to/source/* /path/to/destination/
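
    Nothing below answers whether such a tool exists, but the desired two-pass, skip-and-retry behaviour is easy to sketch as a script. A rough PHP illustration (PHP only to match the other code on this page; every path and the retry count are hypothetical, and note that a plain copy() can still stall on a bad sector just like rsync does):

        <?php
        // Pass 1: copy whatever reads cleanly; remember the failures.
        // Pass 2..N: retry only the files that errored out.
        $source = '/mnt/dying-disk/photos';   // hypothetical
        $dest   = '/mnt/backup/photos';       // hypothetical
        $maxRetries = 3;

        $failed = array();
        $it = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
        );
        foreach ($it as $file) {
            $target = $dest . substr($file->getPathname(), strlen($source));
            @mkdir(dirname($target), 0755, true);
            if (!@copy($file->getPathname(), $target)) {
                $failed[] = $file->getPathname(); // skip now, retry later
            }
        }

        for ($pass = 0; $pass < $maxRetries && $failed; $pass++) {
            $failed = array_filter($failed, function ($path) use ($source, $dest) {
                $target = $dest . substr($path, strlen($source));
                return !@copy($path, $target); // keep in the list if it fails again
            });
        }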

  • Problem when compressing a file using VB.NET

    - by Amr Elnashar
    I have a file like "Sample.bak", and when I compress it to "Sample.zip" I lose the file extension inside the zip file. I mean, when I open the compressed file I find "Sample" without any extension. I use this code:

        Dim name As String = Path.GetFileName(filePath).Replace(".Bak", "")
        Dim source() As Byte = System.IO.File.ReadAllBytes(filePath)
        Dim compressed() As Byte = ConvertToByteArray(source)
        System.IO.File.WriteAllBytes(destination & name & ".Bak" & ".zip", compressed)

    Or using this code:

        Public Sub cmdCompressFile(ByVal FileName As String)
            'Stream object that reads file contents
            Dim streamObj As Stream = New StreamReader(FileName).BaseStream
            'Allocate space in buffer according to the length of the file read
            Dim buffer(streamObj.Length) As Byte
            'Fill buffer
            streamObj.Read(buffer, 0, buffer.Length)
            streamObj.Close()
            'File Stream object used to change the extension of a file
            Dim compFile As System.IO.FileStream = File.Create(Path.ChangeExtension(FileName, "zip"))
            'GZip object that compresses the file
            Dim zipStreamObj As New GZipStream(compFile, CompressionMode.Compress)
            'Write to the Stream object from the buffer
            zipStreamObj.Write(buffer, 0, buffer.Length)
            zipStreamObj.Close()
        End Sub

    Please, I need to compress the file without losing the file extension inside the compressed file. Thanks.
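
    An observation worth illustrating: GZipStream produces a raw gzip stream, which compresses bytes but stores no named directory entry, whereas the ZIP format stores a name per archived file, so an archiver opening a mislabelled .zip has nothing but the archive's own name to display. The contrast shown with PHP's ZipArchive (a concept sketch in PHP to match the other examples on this page, not a VB.NET answer; file names are hypothetical):

        <?php
        // A real ZIP archive stores each file under an explicit entry name,
        // so "Sample.bak" keeps its extension inside "Sample.zip".
        $zip = new ZipArchive();
        if ($zip->open('Sample.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
            $zip->addFile('Sample.bak', 'Sample.bak'); // 2nd argument is the entry name
            $zip->close();
        }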

  • FTP Server upload and filesystem questions

    - by Alex
    I'm a photographer who mainly does event photography. A while ago I bought myself a Nikon WT-4 wireless transmitter, a small device which connects via USB to my Nikon D700 DSLR and then establishes a WiFi connection to an existing WLAN. It can then upload any pictures I take via FTP to an FTP server somewhere in the network.

    On my laptop I have a piece of software which checks a given folder on the disk regularly. This software is smart enough to look at the modified-file timestamp: if the timestamp is less than 10 seconds old, it will not attempt to import the file and skips it in this iteration of the import scan.

    The problem I've discovered seems to be inherent to the FTP protocol, as I have the same problem with Windows 7's built-in IIS server as I do with FileZilla FTP server. When the transmitter starts to upload a file, the FTP server creates a small 300-500 KB file with the correct filename on the disk, but then does nothing with the file until it has completely received it via FTP. So it seems to create this small dummy file, buffer the remainder of the FTP upload until it's finished, and then dump the rest of the data into the dummy file, making it the correct size.

    Problem is, these uploads take about 15-30 seconds depending on reception, but since the folder-watch tool will already try to import any file older than 10 seconds, it will always try to import the small dummy files, which obviously fails as they're not complete yet.

    Is there any way to disable this behaviour? Ideally I would like my file only to show up once it's been completely uploaded. Or perhaps someone knows another FTP server application (it has to run on Windows 7) which does not show this behaviour?
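
    To make the failure mode concrete, the import rule described above boils down to a timestamp check like this (a PHP sketch of the watcher's stated logic, with hypothetical names): because the dummy file is written once and then left alone while the server buffers, its modification time quickly becomes more than 10 seconds old, so the watcher considers the partial file safe to import.

        <?php
        // The watcher's rule as described: import only files whose
        // modification timestamp is at least 10 seconds in the past.
        $watchDir = '/photos/incoming'; // hypothetical
        foreach (glob($watchDir . '/*.jpg') as $file) {
            if (time() - filemtime($file) < 10) {
                continue; // assumed still being written; skip this scan
            }
            importPhoto($file); // hypothetical import routine
        }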

  • Open an Excel file using COM and save it as an .xml file

    - by chupinette
    Hi. I'm trying the following code:

        <?php
        $workbook = "D:\b2\\test.XLS";
        $sheet = "Sheet1";

        # Instantiate the spreadsheet component.
        $ex = new COM("Excel.sheet") or die("Did not connect");

        # Get the application name and version.
        print "Application name: {$ex->Application->value}<BR>";
        print "Loaded version: {$ex->Application->version}<BR>";

        # Open the workbook that we want to use.
        $wkb = $ex->application->Workbooks->Open($workbook) or die("Did not open");

        # Create a copy of the workbook, so the original workbook will be preserved.
        $ex->Application->ActiveWorkbook->SaveAs("D:\b2\Ourtest.xml");
        # $ex->Application->Visible = 1; # Uncomment to make Excel visible.

        # Optionally, save the modified workbook.
        $ex->Application->ActiveWorkbook->SaveAs("D:\Ourtest.xml");

        # Close all workbooks without questioning.
        $ex->application->ActiveWorkbook->Close("False");
        unset($ex);
        ?>

    This actually works and creates the Ourtest.xml file, but I'm getting characters like:

        ÐÏࡱá þÿ þÿÿÿ

    I have also tried SaveAs("D:\Ourtest.pdf"), and it says the file has been corrupted or incorrectly decoded. Can anyone help me please? Thanks
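
    As an aside on what those bytes mean: ÐÏࡱá is the signature of an OLE compound document, which suggests Excel saved the copy in its default binary .xls format and only the file name says .xml. SaveAs accepts a FileFormat argument to control the actual format; a sketch follows (46 should be the xlXMLSpreadsheet constant for the "XML Spreadsheet 2003" format, stated from memory, so treat the constant as an assumption and verify it against the Excel object model):

        <?php
        // Sketch: ask Excel for a real XML save instead of a renamed binary.
        $ex  = new COM("Excel.Application") or die("Did not connect");
        $wkb = $ex->Workbooks->Open("D:\\b2\\test.XLS");
        $wkb->SaveAs("D:\\b2\\Ourtest.xml", 46); // 46 = xlXMLSpreadsheet (assumed)
        $wkb->Close(false);
        $ex->Quit();
        unset($ex);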

  • Replace delimited block of text in file with the contents of another file

    - by rmarimon
    I need to write a simple script to replace a block of text in a configuration file with the contents of another file. Let's assume we have the following simplified files:

    server.xml:

        <?xml version='1.0' encoding='UTF-8'?>
        <Server port="8005" shutdown="SHUTDOWN">
          <Service name="Catalina">
            <Connector port="80" protocol="HTTP/1.1"/>
            <Engine name="Catalina" defaultHost="localhost">
              <!-- BEGIN realm -->
              <sometags/>
              <sometags/>
              <!-- END realm -->
              <Host name="localhost" appBase="webapps"/>
            </Engine>
          </Service>
        </Server>

    realm.xml:

        <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/>

    I want to run a script and have realm.xml replace the contents between the <!-- BEGIN realm --> and <!-- END realm --> lines. If realm.xml changes, then whenever the script is run again it will replace the lines again with the new contents of realm.xml. This is intended to be run in /etc/init.d/tomcat on startup of the service, on multiple installations on which the realm is going to be different. I'm not so sure how I can do this simply with awk or sed.
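
    For illustration only (in PHP rather than awk/sed, to match the other examples on this page): the whole job is one regular-expression splice that keeps the marker lines in place, which is what makes the script safe to re-run whenever realm.xml changes. Paths are hypothetical.

        <?php
        // Sketch: replace everything between the BEGIN/END markers in
        // server.xml with the current contents of realm.xml.
        $serverFile = '/etc/tomcat/server.xml';   // hypothetical path
        $server = file_get_contents($serverFile);
        $realm  = trim(file_get_contents('/etc/tomcat/realm.xml'));

        $server = preg_replace_callback(
            '/(<!-- BEGIN realm -->).*?(<!-- END realm -->)/s',
            function ($m) use ($realm) {
                return $m[1] . "\n" . $realm . "\n" . $m[2]; // markers kept
            },
            $server
        );
        file_put_contents($serverFile, $server);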

  • PHP: What is an efficient way to parse a text file containing very long lines?

    - by Shaun
    I'm working on a parser in PHP which is designed to extract MySQL records out of a text file. A particular line might begin with a string corresponding to which table the records (rows) need to be inserted into, followed by the records themselves. The records are delimited by a backslash and the fields (columns) are separated by commas. For the sake of simplicity, let's assume that we have a table representing people in our database, with the fields being First Name, Last Name, and Occupation. Thus, one line of the file might be as follows:

        [People] = "\Han,Solo,Smuggler\Luke,Skywalker,Jedi..."

    where the ellipsis (...) could be additional people.

    One straightforward approach might be to use fgets() to extract a line from the file, and use preg_match() to extract the table name, records, and fields from that line. However, let's suppose that we have an awful lot of Star Wars characters to track. So many, in fact, that this line ends up being 200,000+ characters/bytes long. In such a case, taking the above approach to extract the database information seems a bit inefficient. You have to first read hundreds of thousands of characters into memory, then read back over those same characters to find regex matches.

    Is there a way, similar to the Java String next(String pattern) method of the Scanner class constructed using a file, that allows you to match patterns in-line while scanning through the file? The idea is that you don't have to scan through the same text twice (to read it from the file into a string, and then to match patterns) or store the text redundantly in memory (in both the file line string and the matched patterns). Would this even yield a significant increase in performance? It's hard to tell exactly what PHP or Java are doing behind the scenes.
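
    One possibility worth sketching (an assumption about fit, not a measured comparison): stream_get_line() reads from a stream only up to an arbitrary delimiter, so each backslash-delimited record can be pulled out on its own without first materializing the whole 200,000-character line.

        <?php
        // Sketch: consume one record at a time using the backslash as the
        // stream delimiter. File name and field handling are hypothetical.
        $fh = fopen('records.txt', 'r');

        // The first chunk before any backslash carries the table header,
        // e.g. the text: [People] = "
        $header = stream_get_line($fh, 4096, '\\');
        preg_match('/^\[(\w+)\]/', $header, $m);
        $table = isset($m[1]) ? $m[1] : null;

        while (($record = stream_get_line($fh, 4096, '\\')) !== false) {
            // e.g. "Han,Solo,Smuggler"; trim any closing quote or newline.
            $fields = explode(',', rtrim($record, "\"\r\n"));
            // ... build the INSERT for $table from $fields here ...
        }
        fclose($fh);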

  • XP: how to apply security to files? Simple file sharing is on now and I can't access some files from other machines

    - by Jules
    For a month or two now I've been using simple file sharing; for several months before that I didn't, and before that I had simple file sharing turned on. So at the moment I don't have a Security tab (on files or folders) or sharing-permission settings there either. As an example, from another machine I can access files from 2007, but not files from the summer of last year in the same folder. I can access all files on that machine locally. So I think I just need to re-apply security or permissions somehow? What should I do?

  • How can I edit local security policy from a batch file?

    - by Stephen Jennings
    I am trying to write a utility as a batch file that, among other things, adds a user to the "Deny logon locally" local security policy. This batch file will be used on hundreds of independent computers (not on a domain; they aren't even on the same network). I assumed my options were one of the following, but perhaps there's one I haven't thought of:

    1. A command-line utility similar to net.exe which can modify local security policy.
    2. A VBScript sample to do the same.
    3. Writing my own using some WMI or Win32 calls. I'd rather not do this one if I don't have to.

  • Increasing file descriptor limit on Debian does not work! Help!

    - by Aco
    I am running Debian 6 and I am trying to increase the file descriptor limit, but it does not want to work. This is what I have done:

    I edited /etc/sysctl.conf by adding the following at the end, and applied the change using sysctl -p:

        fs.file-max = 64000

    I then edited /etc/security/limits.conf and added the following lines:

        * soft nofile 64000
        * hard nofile 64000

    Now when I execute ulimit -Hn and ulimit -Sn I still see 1024. I rebooted the server and I still get the same result. What have I failed to do?

  • Plesk file permissions - Apache/PHP conflicting with user accounts.

    - by hfidgen
    Hiya, I'm building a Drupal site which performs various automatic disk operations as the Apache user (uid 48, per the error below). The problem is that the site was set up on a subdomain belonging to user ID 10001 (i.e. my main FTP account), so the filesystem belongs to that user ID. So I keep getting errors like this:

        warning: move_uploaded_file() [function.move-uploaded-file]: SAFE MODE Restriction in effect. The script whose uid is 10001 is not allowed to access /var/www/vhosts/domain.com/httpdocs/sites/default/files/images/user owned by uid 48 in /var/www/vhosts/domain.com/httpdocs/includes/file.inc on line 579.

    I've tried changing the Apache group in httpd.conf to apache:psacln, psacln being the default group for all web users, but that hasn't helped. The situation now is:

        ..../files/images/     = 777 and chown = ftplogin:psacln
        ..../files/images/user = 775 and chown = apache:psacln
        ..../files/tmp         = 777 and chown = ftplogin:psacln

    So apparently uid 48 and uid 10001 both have permissions to write to any of the three directories involved, but still can't. Am I missing something here? Can anyone help? Thanks!

  • Puppet Directory and File ownership ignored

    - by Phil Sturgeon
    Puppet seems to be lying to me, which is not very nice. I am trying to set some files and directories included in /vagrant/src to be 666 and 777, and to set the ownership group to the correct Apache user (using the PuppetLabs Apache module). Output from Puppet says yes:

        [default] Running provisioner: Vagrant::Provisioners::Puppet...
        [default] Running Puppet with /tmp/vagrant-puppet/manifests/default.pp...
        stdin: is not a tty
        No LSB modules are available.
        warning: require is a metaparam; this value will inherit to all contained resources
        warning: notify is a metaparam; this value will inherit to all contained resources
        notice: /Stage[main]//File[/vagrant/src/addons/]/owner: owner changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/addons/]/group: group changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/addons/]/mode: mode changed '0755' to '0777'
        notice: /Stage[main]//Package[curl]/ensure: ensure changed 'purged' to 'present'
        notice: /Stage[main]//File[/vagrant/src/system/cms/config/]/owner: owner changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/system/cms/config/]/group: group changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/system/cms/config/]/mode: mode changed '0755' to '0777'
        notice: /Stage[main]//File[/vagrant/src/system/cms/config/config.php]/owner: owner changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/system/cms/config/config.php]/group: group changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/system/cms/cache/]/owner: owner changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/system/cms/cache/]/group: group changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/system/cms/cache/]/mode: mode changed '0755' to '0777'
        notice: /Stage[main]//File[/vagrant/src/uploads/]/owner: owner changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/uploads/]/group: group changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/uploads/]/mode: mode changed '0755' to '0777'
        notice: /Stage[main]/Apache/Service[httpd]/ensure: ensure changed 'stopped' to 'running'
        notice: /Stage[main]//File[/vagrant/src/assets/cache/]/owner: owner changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/assets/cache/]/group: group changed 'vagrant' to 'www-data'
        notice: /Stage[main]//File[/vagrant/src/assets/cache/]/mode: mode changed '0755' to '0777'
        notice: Finished catalog run in 2.29 seconds

    Output from ls -lah says no:

        $ ls -lah /vagrant/src/
        total 36K
        drwxr-xr-x 1 vagrant vagrant  510 2012-07-03 00:11 .
        drwxr-xr-x 1 vagrant vagrant  340 2012-07-03 08:08 ..
        drwxr-xr-x 1 vagrant vagrant  136 2012-07-03 00:11 addons
        drwxr-xr-x 1 vagrant vagrant  102 2012-07-03 00:11 assets
        drwxr-xr-x 1 vagrant vagrant  510 2012-07-03 07:45 .git
        -rw-r--r-- 1 vagrant vagrant 1.3K 2012-07-03 00:11 .gitignore
        -rwxr-xr-x 1 vagrant vagrant 1.4K 2012-07-03 00:11 .htaccess
        -rwxr-xr-x 1 vagrant vagrant 8.8K 2012-07-03 00:11 index.php
        drwxr-xr-x 1 vagrant vagrant  442 2012-07-03 00:11 installer
        -rwxr-xr-x 1 vagrant vagrant 2.8K 2012-07-03 00:11 LICENSE
        -rw-r--r-- 1 vagrant vagrant 1.1K 2012-07-03 00:11 phpdoc.dist.xml
        -rw-r--r-- 1 vagrant vagrant 3.3K 2012-07-03 00:11 README.md
        drwxr-xr-x 1 vagrant vagrant  204 2012-07-03 00:11 system
        -rw-r--r-- 1 vagrant vagrant   42 2012-07-03 00:11 .travis.yml
        drwxr-xr-x 1 vagrant vagrant  102 2012-07-03 00:11 uploads

    What's up with that? My entire config can be found here.

  • How to use iTunes USB File Transfer to copy files from PC to Apple iPad, e.g. PDF files for viewer apps

    - by Chris W. Rea
    I'm interested in reading PDF-format ebooks on my Apple iPad. I have half a gig of PDFs I want to transfer to it, from my PC. I'm familiar already with loading EPUB-format titles through iBooks – unfortunately, iBooks doesn't read PDFs so I am looking at using a third-party application. I know many such third-party media viewer applications for the iPad support download from web or email, but that's a hassle. I've heard iTunes 9.1 added support for USB File Transfer, specifically for iPad devices. How does USB File Transfer work in iTunes, for transferring files from my PC to my iPad? Please provide example steps. Moderators: Please remember the FAQ's "except insofar as they interface with your computer." ;-)

  • HTTP Content-type header for cached files

    - by Brian
    Hello, using Apache with mod_rewrite: when I load a .css or .js file and view the HTTP headers, the Content-Type is only set correctly the first time I load it; subsequent refreshes are missing Content-Type altogether, and it's creating some problems for me. Specifically, gzip is not compressing these files. I can get around this by appending a random query-string value to the end of each filename, e.g. http://www.site.com/script.js?12345. However, I don't want to have to do that, since caching is good; all I want is for the Content-Type to be present. I've tried using a RewriteRule to force the type, but it still didn't solve the problem. Any ideas? Thanks, Brian

    More details. HTTP headers WITHOUT the random query-string value (http://localhost/script.js):

        GET /script.js HTTP/1.1
        Host: localhost
        User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept: */*
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://localhost/
        Cookie: PHPSESSID=ke3p35v5qbus24che765p9jni5;
        If-Modified-Since: Thu, 29 Apr 2010 15:49:56 GMT
        If-None-Match: "3440e9-119ed-485621404f100"
        Cache-Control: max-age=0

        HTTP/1.1 304 Not Modified
        Date: Thu, 29 Apr 2010 20:19:44 GMT
        Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 PHP/5.3.1
        Connection: Keep-Alive
        Keep-Alive: timeout=5, max=100
        Etag: "3440e9-119ed-485621404f100"
        Vary: Accept-Encoding
        X-Pad: avoid browser bug

    HTTP headers WITH the random query-string value (http://localhost/script.js?c947344de8278053f6edbb4365550b25):

        GET /script.js?c947344de8278053f6edbb4365550b25 HTTP/1.1
        Host: localhost
        User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept: */*
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://localhost/
        Cookie: PHPSESSID=ke3p35v5qbus24che765p9jni5;

        HTTP/1.1 200 OK
        Date: Thu, 29 Apr 2010 20:14:40 GMT
        Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 PHP/5.3.1
        Last-Modified: Thu, 29 Apr 2010 15:49:56 GMT
        Etag: "3440e9-119ed-485621404f100"
        Accept-Ranges: bytes
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 24605
        Keep-Alive: timeout=5, max=100
        Connection: Keep-Alive
        Content-Type: application/javascript

  • Windows 2003 Server on a domain, XP client PCs on a workgroup - file share without authentication?

    - by Zach
    I have a Windows 2003 server on a domain and client PCs running XP on a workgroup. I have created a file share on the server that should be accessible to the client PCs. I even set the security and sharing to 'Everyone', just to test. When I try to access the file share from any of the XP machines, an authentication prompt appears asking for credentials, even though 'Everyone' currently has full control (just for testing purposes). Why is it asking to authenticate? I need it to not ask. I also made sure passwords were set on all the XP machines, since I found that could be one possible issue, and they all were. Any ideas? Thanks!

  • What is the simplest and fastest way to transfer large file through a Windows network?

    - by Sake
    I have a Windows 2000 Server machine running MS SQL Server that stores over 20GB of data. The database is backed up every day to the second hard drive. I want to transfer those backup files to another computer, to build another test server and for recovery practice. (The backup has never actually been restored in almost 5 years. Don't tell my boss about that!)

    I have trouble transferring that huge file through the network. I've tried plain network copy, Apache download, and FTP. Every method I tried ends up failing when the amount of data transferred reaches 2GB. The last time I successfully transferred the file, it was through a USB-attached external hard drive. But I want to perform this task routinely, and preferably automatically.

    I wonder: what is the most pragmatic approach for this situation?

  • How to disable Windows File Protection in Windows XP or 7 from the Registry?

    - by SEARAS
    How do I disable Windows File Protection in Windows 7 and/or XP from the Registry? I want to automatically replace a driver with a driver I created. I used the PendingFileRenameOperations key in HKLM\System\CurrentControlSet\Control\Session Manager, but I've found that it can only be used for simple (non-system) files, because Windows File Protection blocks it for system files (see this post). Now I need to temporarily disable WFP (and turn it on again after changing the driver). You can also tell me another way to disable it; that would help me too. Thanks in advance! Any ideas?

  • Online File Sharing that acts just like LAN shared drives, etc.

    - by Dayton Brown
    Hi all, I have a small business client that wants to move their current file share to the web. Specs are as follows:

    - 20 to 30 GB of space
    - file sizes are normal (nothing more than 50 to 100 MB)
    - 3 users
    - the ideal solution would have exactly the same functionality as Windows Explorer
    - CHEAP!!! But not super cheap; I would like to keep it around $20 per user per month.

    I've explored a bunch of solutions, but they are all a bit on the complicated side. Thanks in advance for the recommendations.

  • Puzzled about PHP file permission and shared webhosting - what are some explanations?

    - by extrakun
    I have an issue with different web hosts: particular upload scripts can only upload to a folder if it has 777 permissions (which is risky). On the test server (on a different web host), 755 works well. On another host, log files generated by PHP's file functions sometimes cannot be written to, while other files are mysteriously unaffected (for instance, the log files for the entire week are 655 and work well, but today's log file doesn't work unless it is set to 777).

    I am more of an application developer than a server backend expert, so these behaviours puzzle me to no end. Why are they happening? What can be done?
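
    A small diagnostic often explains this kind of mystery (a sketch; posix_* availability varies by host, and the probe file name is made up): PHP may run as the shared web-server user on one host and as the account owner (suEXEC/FastCGI style) on another, which changes which modes are sufficient for writing.

        <?php
        // Report which user PHP effectively runs as on this host, and the
        // mode a freshly created file receives.
        echo 'Script file owner: ' . get_current_user() . "\n";
        if (function_exists('posix_geteuid')) {
            $u = posix_getpwuid(posix_geteuid());
            echo 'Effective user: ' . $u['name'] . ' (uid ' . $u['uid'] . ")\n";
        }
        $probe = __DIR__ . '/perm-probe.tmp'; // hypothetical probe file
        file_put_contents($probe, 'x');
        printf("New file mode: %o\n", fileperms($probe) & 0777);
        unlink($probe);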

  • What naming pattern can I use for sequential file naming (photos) when only their relative sequence is known?

    - by Juhele
    I have some old scanned photos and I want to put them in the correct order. Unfortunately, I have no way to find out the exact order, only a relative one, like: "hm, this photo was surely taken after this one", and so I organize them step by step by manually changing the numbering again and again. Is there any program (preferably free or open source) where I could interactively put the photos in the correct order straight away (maybe by dragging with the mouse to change the order) and finally apply some file renaming to keep that order? Thank you in advance.

    PS: running Windows (XP and 7), but if you know something for Linux, let me know too, please.
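
    On the naming-pattern half of the question, one common convention (an aside, not a program recommendation) is zero-padded numbers with gaps, e.g. photo_0010, photo_0020, photo_0030, so a newly placed photo can later be slotted between two neighbours without renumbering everything. A PHP sketch of applying such numbering to a list that is already in the right relative order (all names hypothetical):

        <?php
        // Rename an ordered list of scans with gap numbering (0010, 0020, ...).
        $ordered = array('scan_07.jpg', 'scan_03.jpg', 'scan_11.jpg'); // hypothetical order
        $step = 10;
        foreach ($ordered as $i => $file) {
            rename($file, sprintf('photo_%04d.jpg', ($i + 1) * $step));
        }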

  • Serving protected files using Nginx's X-Accel-Redirect header

    - by andybak
    I'm trying to serve protected files using this directive in my nginx.conf:

        location /secure/ {
            internal;
            alias /home/ldr/webapps/nginx/app/secure/;
        }

    I'm passing in paths in the form "/myfile.doc", and the file's path would be /home/ldr/webapps/nginx/app/secure/myfile.doc. I just get 404s when I access "http: //myserver/secure/myfile.doc" (space inserted after http to stop ServerFault converting it to a link). I've tried taking the trailing / off the location directive and that makes no difference.

    Two questions:

    1. How do I fix it?
    2. How can I debug problems like this myself? How can I get nginx to report which path it's looking for? error.log shows nothing, and access.log just tells me which URL is being requested; that's the bit I already know! It's no fun trying things randomly without any feedback.

    Here's my entire nginx.conf:

        daemon off;
        worker_processes 2;

        events {
            worker_connections 1024;
        }

        http {
            include mime.types;
            default_type application/octet-stream;

            server {
                listen 21534;
                server_name my.server.com;
                client_max_body_size 5m;

                location /media/ {
                    alias /home/ldr/webapps/nginx/app/media/;
                }

                location / {
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    fastcgi_pass unix:/home/ldr/webapps/nginx/app/myproject/django.sock;
                    fastcgi_pass_header Authorization;
                    fastcgi_hide_header X-Accel-Redirect;
                    fastcgi_hide_header X-Sendfile;
                    fastcgi_intercept_errors off;
                    include fastcgi_params;
                }

                location /secure {
                    internal;
                    alias /home/ldr/webapps/nginx/app/secure/;
                }
            }
        }

    EDIT: I'm trying some of the suggestions here. So I've tried:

        location /secure/ {
            internal;
            alias /home/ldr/webapps/nginx/app/;
        }

    both with and without the trailing slash on location. I've also tried moving this block before the "location /" directive. The page I linked to has ^~ after 'location', giving:

        location ^~ /secure/ { ...etc... }

    Not sure what that signifies, but it didn't work either!
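
    For context, the application side of X-Accel-Redirect just emits a header naming the internal URI, not the filesystem path; sketched here in PHP for consistency with this page's other examples, even though the backend above is Django:

        <?php
        // Hand the download off to nginx. With the /secure/ location above,
        // the URI /secure/myfile.doc maps to
        // /home/ldr/webapps/nginx/app/secure/myfile.doc on disk.
        // Any access-control checks would run before these headers.
        header('Content-Disposition: attachment; filename="myfile.doc"');
        header('X-Accel-Redirect: /secure/myfile.doc');
        exit;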

  • Why is file sharing over internet still working, despite all firewall exceptions for filesharing being disabled?

    - by Triynko
    Every exception in my Windows Server firewall that starts with "File and Printer Sharing" is disabled (ordered by name, so that includes the domain, public (active), and private profiles). The Network and Sharing Center's options for everything except password-protected sharing are off. Why would I still be able to access a network share on that server via an address like "\\my.server.com\" over the internet?

    The firewall is on for all profiles and blocking incoming connections by default. A "netstat -an" command on the server reveals the share connection is occurring over port 445 (SMB). I restarted the client to ensure it was actually re-establishing a new connection, which it did successfully. Is the "Password protected sharing: On" option in Network and Sharing Center bypassing the firewall restrictions, or adding some other exception somewhere that I'm missing?

    EDIT: "Custom" rules are not the problem. It's the built-in rules for Terminal Services that were the problem. Can you believe port 445 (the file sharing port) has to be wide open to the internet to use Terminal Services Licensing?

  • Create text file named after a cell containing other cell data

    - by user143041
    I tried using the code below in Excel on my Mac Mini (OS X version 10.7.2), and it keeps reporting an error due to the file name/path. (The Excel file I am creating is going to be a template with my formulas and macros installed, which will be used over and over.)

        Sub CreateFile()
            ' Walk down the sheet until the cell next to the active cell is empty.
            Do While Not IsEmpty(ActiveCell.Offset(0, 1))
                ' File named after the current cell's value.
                MyFile = ActiveCell.Value & ".txt"
                fnum = FreeFile()
                Open MyFile For Output As fnum
                ' File content comes from the two cells to the right.
                Print #fnum, ActiveCell.Offset(0, 1) & " " & ActiveCell.Offset(0, 2)
                Close #fnum
                ActiveCell.Offset(1, 0).Select
            Loop
        End Sub

    What I'm trying to do:

    1st objective: I would like the following data to be used to create text files. Column A (A:A) is what I need the name of each file to be, and B2 is the content I need in the text file. So A2 ("repair-video-game-Glassboro-NJ-08028.txt") is the file name and B2 is the content of that file. Next, A3 is the file name and B3 is the content for that file, and so on. Once the content reaches what is in cells A16 and B16 (the length will vary), the file creation should stop; if not, I can delete the additional files created. This sheet will never change. Is there a way to make the Excel macro always go to this sheet, instead of having to select it with the mouse to identify the starting point?

    2nd objective: I would like the following data to be used to create a text file. A1 is what I need the name of the file to be, and column B (B:B) is the content I want in the file. So A2 is the file name ("geo-sitemap.xml") and B:B is the content of the file (ignore the .xml file extension in the photo). Once the content cell reaches what is in cell B16 (the length will vary), the file creation should stop; if not, I can adjust the cells that need content (the formulated content you see in the image is preset for 500 rows). This sheet will never change. Is there a way to make the Excel macro always go to this sheet, instead of having to select it with the mouse to identify the starting point?

    I can provide the content in the cells that are filled in by Excel formulas but are not to be included in the .txt files. It is OK if that is not possible; I can delete the extra cells that are not populated (based on the data sheet). Please let me know if you need any additional information or clarity and I will be happy to provide it.
