Search Results

Search found 39195 results on 1568 pages for 'export to plain text'.

  • Recommend Online password manager [closed]

    - by Dmitriy Nagirnyak
    Possible Duplicate: online password manager with sharing capabilities

    Hi, I am looking for a good online password manager with the following requirements:

    - Single-click login from the browser.
    - Single-click form saving from the browser.
    - Not tied to a single PC.
    - An offline version, so I can use it without internet access (for example, plug in a USB stick and have the last synced data).
    - The ability to store plain text (notes, for example).
    - Works on Windows, Linux and Mac.

    So far I have been happy with RoboForm, but its offline USB version is not available on Linux. Please recommend. Thanks, Dmitriy.

  • SSH with X11 forwarding to host where I don't have a home-dir

    - by Albert
    I am trying to ssh with X11 forwarding into a host where I don't have a home directory. Because of that, xauth fails and X11 doesn't seem to work. I tried to specify a home directory in advance, but I guess it doesn't export env-vars to the host:

        zeyer@demeter:~> HOME=/tmp ssh ares -XY
        Password:
        Warning: No xauth data; using fake authentication data for X11 forwarding.
        Last login: Mon Mar 28 11:52:57 2011 from demeter.matha.rwth-aachen.de
        Have a lot of fun...
        Could not chdir to home directory /home/zeyer: No such file or directory
        /usr/bin/xauth: error in locking authority file /home/zeyer/.Xauthority
        zeyer@ares:/>

    Is there any trick to make the X11 forwarding work? I still have write access to /tmp, but I am not sure how to set up the xauth fake authentication data manually.
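
    One workaround that might be worth trying (an untested sketch, not from the original question; file names are hypothetical) is to point xauth at an authority file in /tmp instead of the missing home directory:

        # on the remote host, in the forwarded ssh session:
        export XAUTHORITY=/tmp/zeyer-xauthority   # any path you can write to
        touch "$XAUTHORITY" && chmod 600 "$XAUTHORITY"
        # add the cookie for the forwarded display by hand; the value can be
        # copied from the client side with: xauth list "$DISPLAY"
        xauth add "$DISPLAY" MIT-MAGIC-COOKIE-1 <cookie-from-client>

    Whether the forwarded connection then authenticates depends on the cookie sshd generated, so this may need fiddling.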

  • Dovecot Virtual Users and Users Domain Mapping

    - by Stojko
    I have successfully compiled, configured and run Dovecot with the virtual users feature. Here's part of my /etc/dovecot.conf configuration file:

        mail_location = maildir:/home/%d/%n/Maildir

        auth default {
          mechanisms = plain login
          userdb passwd-file {
            args = /home/%d/etc/passwd
          }
          passdb passwd-file {
            args = /home/%d/etc/shadow
          }
          socket listen {
            master {
              path = /var/run/dovecot/auth-worker
              mode = 0600
            }
          }
        }

    I face one issue I can't resolve myself. Is there any way to create a users-to-domains mapping and provide the username in mail_location? Examples:

    1. Currently I have /home/domain.com/user/Maildir
    2. I'd like to have /home/USER/domain.com/user/Maildir

    Can I achieve this somehow? Greets, Stojko
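
    If the USER part can live in the passwd-file itself, one possible direction (a sketch based on Dovecot's passwd-file extra-fields mechanism, not tested against this setup) is to give each virtual user a home under /home/USER and let an extra field override the mail location:

        # /home/domain.com/etc/passwd -- the home field and the trailing
        # userdb_mail extra field are hypothetical values for illustration
        user:{PLAIN}secret:5000:5000::/home/USER/domain.com/user::userdb_mail=maildir:~/Maildir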

  • convert .p7b key to a .pfx

    - by DrStalker
    I have an SSL certificate in .p7b format that I need to convert to .pfx. If I try this through the Windows certificate management console, the option to export as a .pfx is disabled. Trying with openssl, I have found the following two commands to do the conversion:

        openssl pkcs7 -print_certs -in certificate.p7b -out certificate.cer
        openssl pkcs12 -export -in certificate.cer -inkey privateKey.key -out certificate.pfx -certfile CACert.cer

    but I'm not sure what key to use for the second command, or what certificate CACert.cer refers to. How can I convert this key to .pfx format?
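
    For reference: privateKey.key is the private key that was generated alongside the original certificate request (a .p7b contains certificates only, no key), and CACert.cer is the issuing CA's certificate. Once the .pfx is built, it can be sanity-checked with a command like:

        # list the key and certificate chain inside the bundle
        openssl pkcs12 -info -in certificate.pfx -noout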

  • Script / command to drop all connections / locks in Sybase SQL Anywhere 9?

    - by nxzr
    I've recently become responsible for administering an application which is essentially a front end to a Sybase SQL Anywhere 9 database, including the database itself. I'd like to use unload table to efficiently export the data for backup and, in the case of a few tables, ETL to get it into a reporting database / small scale data warehouse. The problem is that the client application crashes and leaves dead connections and shared locks on a pretty regular basis, which seems to prevent unload table from getting the (brief) exclusive locks it needs. Currently I use Sybase Central to verify that these connections are in fact zombies and drop them myself at the end of the day / week. Is there a command or script to drop all connections? Being able to drop everything at once after verifying that they're unneeded would be quite helpful but I haven't found a way to do it.
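
    One scriptable direction (a sketch using SQL Anywhere's documented sa_conn_info() procedure and DROP CONNECTION statement; the login name and connection number below are placeholders):

        -- run in dbisql with DBA authority: list candidate zombies ...
        SELECT Number, Userid FROM sa_conn_info() WHERE Userid = 'app_user';
        -- ... then drop each one by its connection number
        DROP CONNECTION 42;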

  • How do I enable the in-place Tablet PC Input Panel on non-tablet PCs?

    - by yngvedh
    Hi all, is it possible to enable the in-place Tablet PC Input Panel on a non-tablet PC? I have checked the "For tablet pen input, show the Input Panel icon next to the text entry area when possible" checkbox in the Input Panel options. Does this not work because pen input is something different from mouse input? I do have a touch screen, but it just emulates a mouse (moving the cursor, pressing the left mouse button and such). I can get the Input Panel to show manually by starting tabtip.exe, and then even the ink works, but I cannot get it to show (itself or its in-place icon) when I activate text input controls. Does anyone know what's up?

  • Is there a way to use an inline PNG image in an Outlook e-mail?

    - by James McMahon
    In my work as a developer I sometimes find myself sending detailed emails with screenshots to illustrate some point or problem. The content of these screenshots is often text. So, knowing that PNG handles compression of images containing text much better, I save my screenshots as PNG and insert them into my email. However, whenever I check my sent mail, the images are clearly being sent as JPG because they look horribly compressed. I'm using Outlook 2003 as my email program. Is there some setting I can change to make Outlook send inline images as PNGs?

  • rm on a directory with millions of files

    - by BMDan
    Background: physical server, about two years old, 7200-RPM SATA drives connected to a 3Ware RAID card, ext3 FS mounted noatime and data=ordered, not under crazy load, kernel 2.6.18-92.1.22.el5, uptime 545 days. The directory doesn't contain any subdirectories, just millions of small (~100 byte) files, with some larger (a few KB) ones.

    We have a server that has gone a bit cuckoo over the course of the last few months, but we only noticed it the other day when it started being unable to write to a directory due to it containing too many files. Specifically, it started throwing this error in /var/log/messages:

        ext3_dx_add_entry: Directory index full!

    The disk in question has plenty of inodes remaining:

        Filesystem            Inodes   IUsed      IFree IUse% Mounted on
        /dev/sda3           60719104 3465660   57253444    6% /

    So I'm guessing that means we hit the limit of how many entries can be in the directory file itself. No idea how many files that would be, but it can't be more, as you can see, than three million or so. Not that that's good, mind you! But that's part one of my question: exactly what is that upper limit? Is it tunable? Before I get yelled at: I want to tune it down; this enormous directory caused all sorts of issues.

    Anyway, we tracked down the issue in the code that was generating all of those files, and we've corrected it. Now I'm stuck with deleting the directory. A few options here:

    1. rm -rf (dir) — I tried this first. I gave up and killed it after it had run for a day and a half without any discernible impact.
    2. unlink(2) on the directory — definitely worth consideration, but the question is whether it'd be faster to delete the files inside the directory via fsck than to delete via unlink(2). That is, one way or another, I've got to mark those inodes as unused. This assumes, of course, that I can tell fsck not to drop the entries to the files in /lost+found; otherwise, I've just moved my problem. In addition to all the other concerns, after reading about this a bit more, it turns out I'd probably have to call some internal FS functions, as none of the unlink(2) variants I can find would allow me to just blithely delete a directory with entries in it. Pooh.
    3. A shell loop:

        while [ true ]; do ls -Uf | head -n 10000 | xargs rm -f 2>/dev/null; done

    This is actually the shortened version; the real one I'm running, which just adds some progress reporting and a clean stop when we run out of files to delete, is:

        export i=0;
        time ( while [ true ]; do
          ls -Uf | head -n 3 | grep -qF '.png' || break;
          ls -Uf | head -n 10000 | xargs rm -f 2>/dev/null;
          export i=$(($i+10000));
          echo "$i...";
        done )

    This seems to be working rather well. As I write this, it's deleted 260,000 files in the past thirty minutes or so.

    Now, for the questions:

    1. As mentioned above, is the per-directory entry limit tunable?
    2. Why did it take "real 7m9.561s / user 0m0.001s / sys 0m0.001s" to delete a single file which was the first one in the list returned by ls -U, and it took perhaps ten minutes to delete the first 10,000 entries with the command in #3, but now it's hauling along quite happily? For that matter, it deleted 260,000 in about thirty minutes, but it's now taken another fifteen minutes to delete 60,000 more. Why the huge swings in speed?
    3. Is there a better way to do this sort of thing? Not store millions of files in a directory, I know; that's silly, and it wouldn't have happened on my watch. Googling the problem and looking through SF and SO offers a lot of variations on find that obviously have the wrong idea; it's not going to be faster than my approach for several self-evident reasons. But does the delete-via-fsck idea have any legs? Or something else entirely? I'm eager to hear out-of-the-box (or inside-the-not-well-known-box) thinking.

    Thanks for reading the small novel; feel free to ask questions and I'll be sure to respond. I'll also update the question with the final number of files and how long the delete script ran once I have that. Final script output:

        2970000...
        2980000...
        2990000...
        3000000...
        3010000...

        real    253m59.331s
        user    0m6.061s
        sys     5m4.019s

    So, three million files deleted in a bit over four hours.

  • Custom prompt doesn't work on Mac's terminal

    - by mareks
    I like to use a custom prompt (current path in blue) on my Unix machine:

        export PS1='\[\e[0;34m\]\w \$\[\e[m\] '

    But when I try to use it in the Mac's Terminal it doesn't work: bash fails to detect the end of the prompt and overwrites the prompt when I type commands. This also happens when I'm entering a long command, where it wraps over the same line instead of starting a new line. I don't understand why this is the case, since I use bash on both machines. Any suggestions on how to remedy this?
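
    One variant that might behave better (a sketch, assuming the problem comes down to how the bash build shipped with OS X parses the escapes; functionally the same prompt): spell the escape character as \033 rather than \e, and use an explicit reset code:

        export PS1='\[\033[0;34m\]\w \$\[\033[0m\] '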

  • Mysteriously empty $_POST array

    - by Lex
    Hi all! I have the following HTML/PHP page:

        <?php
        if (empty($_SERVER['CONTENT_TYPE'])) {
            $type = "application/x-www-form-urlencoded";
            $_SERVER['CONTENT_TYPE'] = $type;
        }
        echo "<pre>";
        var_dump($_POST);
        var_dump(file_get_contents("php://input"));
        echo "</pre>";
        ?>
        <form method="post" action="test.php">
            <input type="text" name="test[1]" />
            <input type="text" name="test[2]" />
            <input type="text" name="test[3]" />
            <input type="submit" name="action" value="Go" />
        </form>

    As you can see, the form will submit, and the expected output is a POST array with one array in it containing the filled-in values and one entry "action" with the value "Go" (the button). However, no matter what values I enter in the fields, the result is always:

        array(2) {
          ["test"]=>
          string(0) ""
          ["action"]=>
          string(2) "Go"
        }
        string(16) "test=&action=Go&"

    Somehow the array named test is emptied, while the "action" variable does make it through. I've used the Live HTTP Headers extension for Firefox to check whether the POST fields get submitted, and they do. The relevant information from Live HTTP Headers (with a, b and c filled in as values in the textboxes):

        Content-Type: application/x-www-form-urlencoded
        Content-Length: 51

        test%5B1%5D=a&test%5B2%5D=b&test%5B3%5D=c&action=Go

    Does anybody have any idea as to why this is happening? I'm freaking out on this one; it has cost me so much time already.

    EDIT: We've tried this on different servers. On Windows boxes it does work; on the Ubuntu server with PHP version 5.2.4 (with Suhosin), it doesn't. It even works on a different server, also with Ubuntu and the same PHP version, also with Suhosin installed.
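
    Given the Suhosin hint in the EDIT, one test that might narrow it down (a sketch using Suhosin's documented limit directives; the values are arbitrary guesses for testing, not a confirmed fix) is to raise the POST/request limits in php.ini and see whether the array survives:

        ; php.ini -- deliberately generous values, for testing only
        suhosin.post.max_vars = 2048
        suhosin.request.max_vars = 2048
        suhosin.post.max_array_depth = 100
        suhosin.request.max_array_depth = 100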

  • Best way to Duplicate a Laptop's Hard Drive One-to-One

    - by Urda
    I have a Lenovo X61 Tablet computer with a plain SATA drive inside, dual-booting Windows 7 and Ubuntu 9.10. I want to back up both of these OSes and their special partitions (Windows 7 has one, and of course there is the Linux swap). I want a one-to-one backup; all of my mission-critical data is already backed up, but I would like to get a snapshot and store it on a larger file server at home for quick recovery. What is the best approach to do this?
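
    One straightforward way to get such a snapshot (a sketch, assuming the laptop can boot a live CD and the file server accepts ssh; the device, host, and file names are placeholders):

        # boot a live CD so nothing on the disk is mounted, then image it
        dd if=/dev/sda bs=4M conv=sync,noerror | gzip -c \
          | ssh user@fileserver 'cat > x61-disk.img.gz'

        # restore later with the reverse pipeline
        ssh user@fileserver 'cat x61-disk.img.gz' | gunzip -c | dd of=/dev/sda bs=4M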

  • How to make use of movie events imported by imovie 09 from a DVR

    - by overboming
    I have imported about 100 clips of video from my DVR using iMovie 09, and they are all saved under the Movie Events folder; this is okay. The problem is that the movie event files are not in a standard format: they are in an 'Apple Intermediate Format' that only QuickTime and iMovie recognize and play. I simply want to give my 100 clips to someone using a PC, and this intermediate format produced by iMovie is not playable or convertible by anything I have found. Now the only option for me seems to be creating a project in iMovie, dragging all the clips into the project, and then exporting these 100 clips into a single standard file. But iMovie doesn't even let me do that conveniently: I can only click on a clip, select all, drag it into the project, and repeat 100 more times. Is there an alternative way to do this (or a way to use QuickTime Player to convert the video formats one by one)? Thanks
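
    If a command-line tool is acceptable, a batch conversion along these lines might work (a sketch, assuming ffmpeg is installed and its Apple Intermediate Codec decoder handles these clips; the event path is a placeholder):

        cd ~/Movies/iMovie\ Events.localized/MyEvent
        for f in *.mov; do
          ffmpeg -i "$f" "${f%.mov}.mp4"   # let ffmpeg pick default MPEG-4 settings
        done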

  • Scan to email messages downloaded by POP connector on SBS 2008 are delivered to badmail folder.

    - by Jon
    I have a Kyocera MFD. I have it set to scan to email using the ISP's SMTP server. The messages are delivered and received by the hosted email server. The POP connector on the SBS server downloads the messages, but delivers them to the badmail folder. I have given the message a subject and a body. Spam score from the message header:

        MIME-Version: 1.0
        X-Mailer: NetWorkScanner Mail System Version 1.1
        Content-Type: multipart/mixed;
          boundary="------------V2VkLCAxMCBNYXIgMjAxMCAxMjoxOToxNSArMDAwMA=="
        X-Spam-Status: No, score=-2.6
        X-Spam-Score: -25
        X-Spam-Bar: --
        X-Spam-Flag: NO

        --------------V2VkLCAxMCBNYXIgMjAxMCAxMjoxOToxNSArMDAwMA==
        Content-Type: text/plain; charset="us-ascii"

  • What is the best free software to learn touch-typing?

    - by gojira
    What is the best free software for learning touch-typing? Features it needs to have:

    - It should NOT display the keyboard layout on the screen.
    - It should give detailed statistics that actually measure progress (which keys have the highest error rate, graphs showing how typing speed improved over time, etc.).
    - It should enable me to actually learn touch-typing in about one long weekend where I don't do much else than learn to touch-type.
    - It would be very good if I could load a text file and the program would use words from that file for the exercises as well.

    My goals are: at least the same typing speed as I have now, but with touch-typing, and being able to look only at the screen when typing.

    P.S.: I forgot to mention, I'm using Windows 7. And I know what the Dvorak and Colemak keyboard layouts are, but I'm not interested in them; my question is with respect to the standard US keyboard layout.

  • Best way to migrate IIS6 from one server to another

    - by darko-romanov
    Hi, I need to move all my sites from a server with IIS 6 to another one that has the same OS (Windows Server 2003) and the same IIS version, and I'm trying to understand the best way to do it. Searching on Google I've found at least two methods: one uses the IIS Migration Tool, and the other the Web Deployment Tool (MSDeploy). I don't know which method is best, and it also seems that both methods export one site at a time, while I have about 100 sites hosted. What would you do?
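
    For what it's worth, the Web Deployment Tool can sync the whole IIS 6 metabase at once rather than one site at a time (a sketch based on MSDeploy's metaKey provider; paths are placeholders):

        REM on the source server: package the entire IIS 6 configuration
        msdeploy -verb:sync -source:metakey=lm/w3svc -dest:package=C:\w3svc.zip

        REM on the destination server: restore the package
        msdeploy -verb:sync -source:package=C:\w3svc.zip -dest:metakey=lm/w3svc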

  • PHP not working when accessed through a domain name, but works fine when accessed through IP

    - by Allister
    I've done a basic setup of Ubuntu Server, installing Apache, PHP and MySQL through tasksel. When I browse to the IP address of the server, it works fine and renders PHP scripts. So I added a DNS entry for the server on my local DNS server, calling it webdev.lazer.net. When I go to this domain name in my browser, it renders HTML documents fine, but PHP scripts are not rendered and instead download as plain text (as if the PHP parser isn't handling .php documents). I'm sure it's some rookie mistake, but any help would be appreciated. Thanks
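
    One quick check (general Apache debugging, not from the original post): make sure the domain name isn't being answered by a different virtual host than the IP address, since a vhost can override the PHP handler:

        # dump the parsed virtual-host configuration and see which vhost
        # matches webdev.lazer.net versus the bare IP
        apache2ctl -S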

  • variables in batch files

    - by richzilla
    Hi all, I'm trying to set up a batch file to automatically deploy a PHP app to a web server. Basically, I want an entirely automated process: I would just give it a revision number from the repository, and it would then export the files, upload them via FTP, and update the deployment info at the repo host (Codebase). However, I'm starting from scratch here, and I want to know: how would I set up a batch file to accept a variable when it is run? For example, typing myfile.bat /revision 42 would enable me to deploy revision 42 to my server. If anyone can point me in the right direction I'd be very appreciative.
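
    Command-line arguments reach a batch file as %1, %2, and so on, so a skeleton along these lines might be a starting point (a sketch; the repository URL and paths are placeholders):

        @echo off
        REM usage: deploy.bat /revision 42  -> %1 is "/revision", %2 is "42"
        set REV=%2
        if "%REV%"=="" echo Usage: %0 /revision revision-number & exit /b 1

        REM export the requested revision from the repository
        svn export -r %REV% http://svn.example.com/myapp C:\deploy\%REV%
        echo Exported revision %REV%; the FTP upload and Codebase update go here.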

  • Keeping websites from knowing where I live

    - by D Connors
    This question is related to issues and practicality, not security. I live in Brazil and, apparently, every single website I visit knows about it. Usually that's okay, but quite a few sites don't make use of that information adequately. For instance:

    - Bing keeps thinking that Brazilian pages are way more relevant to me than American ones (which they're not).
    - google.com always redirects me to google.com.br.
    - Microsoft automatically sends me to horribly translated support pages in Portuguese (which would just be easier to read in English).

    These are just a few examples. Usually it's stuff I can live with (or work around), but some of it is just plain irritating. I have geolocation disabled in Firefox, so I guess they're getting this information either from my IP or from Windows itself (which I bought here). Is there a way to avoid this? Either tell them nothing, or make them think I live somewhere else?

  • Open source CMS for a university department

    - by Greg Kuperberg
    I realize that this type of question gets asked over and over again. Nonetheless, I want to ask a more specific version. I'm in a university math department. Long ago our sysadmins (or just one, at the time) switched to a web content management system. At the time, Zope looked like an informed choice. We have used Zope for years, but at least in my opinion, it has always been a controversial decision. At the time I didn't understand why it was so important to have a web CMS. Now I see that it certainly is important, but I don't know that it should be Zope. The good (even necessary) features of Zope for us are:

    - It's free and Linux-based.
    - It is a true CMS and not something else (e.g. a wiki or blog).
    - It lets you write HTML and scripts.

    What I really don't like about Zope is that the outcome of using it is all-or-nothing in a lot of ways. At least in convenient use, it ends up dividing the enterprise into superusers who can do everything and lusers who can't do anything (except write their own home pages in plain HTML). It has a huge user manual, which end users won't have time to read. Somehow, with the access permissions, the simple thing to do is to let a few admins access all of the source and data, and that's it. Since this is a math department, the user base varies from real novices to people who understand computers reasonably well. But as it stands, any change that involves Zope has to go through the sysadmins. When the sysadmins are in a hurry, sometimes they will also just add plain HTML pages to the web site instead of using the Zope framework. It doesn't help matters that Zope is fairly disk-intensive and fairly hype-intensive.

    Not to dwell on Zope too much, but I am wondering what the right web CMS is for a mixed user base of terminal novices, quick studies, and experienced users. Some users might want intermediate permissions, e.g. read permission but not write permission, or permission to change some subset of the pages or see some subset of the database tables. Also, it should be Linux-based, open source, and a little bit scalable; and of course widely used and well supported is a good idea. I might guess that the answer is Drupal, just because that was the general answer before, but I don't know if it is the right type of CMS for this purpose. (But note that Python is a relatively popular language in a math department, among other reasons because Sage is based on Python.)

    I can see that I didn't completely define the question and that people are guessing what type of site it is. It is the UC Davis Math Department. The main structure of the site is not suitable for a wiki, and it is also not the same thing as a course environment like Moodle. Rather, the site is mostly structured as a generic medium-small enterprise. Some components of the site could be a wiki, Moodle, a LaTeX plugin, Request Tracker, etc. However, the main issue is not these components; the main issue is that it would be better to decentralize management of the site. Right now, everything in the Zope CMS has to go through the sysadmins. Every other user in the department either has to put in a request to them or write their own web pages with no help from Zope. There are two main reasons for this: (1) other people in the department don't have time to read the Zope manual, and (2) it's a hassle to set up intermediate permissions in Zope. However, there are other people in the department who know how to write computer programs and use markup languages.

    I wouldn't want a solution that assumes that users either can't be trusted with much more than drag-and-drop, or that they are IT professionals who sleep with documentation manuals. I'm wondering if Plone/Zope still has this quality, since certainly Zope by itself does. But I also wonder sometimes if common-sense flexibility is unfashionable these days, and whether things in general have to be either mindlessly easy or incredibly powerful.

  • Mac OS X bash prompt bug?

    - by Memo
    I am trying to set my bash prompt to display the time and current directory in bold:

        export PS1="\[\e[1m\][\A] \w \$ \[\e[0m\]"

    This does apparently work, but when I use the command history (ctrl-r), after finding the command I was searching for and pressing enter, this line is not displayed correctly. Here is an example:

        [21:58] ~/Wyona/svn-repos/zwischengas $ (reverse-i-search)`ta': tail -F logs/log4j-cnode1.log

    becomes, after pressing enter:

        [21:58] ~/Wyona/svn-repos/zwischengas $ -F logs/log4j-cnode1.log

    Of course, this is not "really" a problem, since the command does work correctly, but it is still annoying. Does anybody know why this happens? And, more importantly, how to prevent/fix it? Thanks, Memo

  • How to recover a server from a tar file

    - by Mitch
    In Moodle, the LMS, you can export courses as a tar.gz, and someone said they were going to give me such a thing. I was surprised by the 6 GB size. I was even more surprised when I extracted it and found the root directory to be the root of the server. The person giving me the course, instead of exporting it, must have just tarred the entire server! How should I go about recovering this? Is there any way to start it up in a virtual machine? I have a whole Linux server; what to do? I could probably just hand-pick the data files I need, but how do I access a MySQL database without a running MySQL server? I am so stumped!
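
    For the database part, one option (a sketch, assuming the tarball contains a MySQL data directory, typically var/lib/mysql, and that a compatible MySQL version is installed locally; paths and the database name are placeholders):

        # start a throwaway mysqld against the extracted data directory
        mysqld_safe --datadir=/path/to/extracted/var/lib/mysql \
                    --socket=/tmp/recover.sock --skip-networking &

        # connect through that socket and dump what you need
        mysqldump --socket=/tmp/recover.sock -u root moodle > moodle.sql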

  • How to edit screen grabbed video on Windows XP?

    - by tangens
    Problem: I've used CamStudio to record a presentation (fullscreen, 1280x1024, 45 min, no audio) to a small number of AVI files (10 files with about 800 MB total). Now I want to remove the initial and trailing sequences where you see CamStudio starting and stopping, and I'd like to remove some pauses during the video, too.

    Question: Could you recommend some programs for Windows XP that I can use for this task? The result should be a (small; at least not bigger than the original) video format that I can play back. I have no need to create a DVD etc.

    Already tried: MAGIX Video Deluxe 16 (trial version), but it takes about two hours to export just 10 minutes of the video and produces about 2 GB of data for this.
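
    If a command-line tool is an option, cutting without re-encoding is fast and adds no size (a sketch, assuming an ffmpeg Windows build that can read CamStudio's AVIs; the timestamps and file names are placeholders):

        REM keep 44 minutes starting at 0:00:10, copying the streams as-is
        ffmpeg -ss 00:00:10 -i capture01.avi -t 00:44:00 -c copy trimmed01.avi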

  • Certain web pages are suddenly not rendering properly in FireFox

    - by LeopardSkinPillBoxHat
    I am using Firefox 3.6.3. I noticed in the last couple of days that several webpages I visit regularly are not rendering properly. A lot of the text overlaps other text, and it basically looks like the style sheet is completely screwed up. I have tried disabling all of my add-ons, and it doesn't make a difference. When I use Coral IE Tab to render the pages using IE, they display without any problems. The websites that are not rendering properly for me are:

    - The Age
    - Google Reader

    One interesting thing I noticed: if I modify the Google Reader URL to not use SSL (i.e. change https to http), it renders without any issues. However, The Age website is not using SSL, and it still doesn't render properly. I have also disabled my proxy server (I normally use one at work), but this doesn't make a difference either.

  • How to get Mac OS X Terminal.app and screen/vim scrolling to play nice?

    - by rustychains
    OS X 10.6.3, Terminal.app. I am a pretty dedicated screen user, but Terminal.app's line buffer and/or scrollback does not seem to work for me: while in screen, anything that goes past the top of the frame is gone and can't be scrolled back to. This works okay in other terminal apps (GNOME Terminal, Cygwin). Perhaps this is a shell environment, config, or command issue? My .screenrc:

        startup_message off
        autodetach on
        shell -$SHELL
        vbell off
        defutf8 on
        caption always
        caption string "%{= wk}%w"

    I have tried using defscrollback here with different values, but it doesn't have an effect. Some .bashrc settings:

        set -o physical
        export TERM=xterm-color
        shopt -s checkwinsize
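
    The usual .screenrc fix for this (widely cited for screen under Terminal.app; worth a try here) is to stop screen from switching to the terminal's alternate screen buffer, so output lands in Terminal.app's own scrollback:

        # in ~/.screenrc: disable the alternate-screen (ti/te) switch
        termcapinfo xterm* ti@:te@
        # and keep an in-screen scrollback as a fallback (ctrl-a [ to scroll)
        defscrollback 10000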

  • Mac OS X Duplex Printing Paper Handling Oddness

    - by Christian Lindig
    I like to print on stationery with a pre-printed letterhead using Preview.app and a duplex-capable HP PostScript printer (Color LaserJet 4700). One would think that pre-printed stationery could be placed in one of the trays and then printed on the front and reverse sides. Unfortunately, the print dialog handles one-page and two-page documents differently: the stationery needs to be placed differently in the tray for a one-page document than for a two-page document. This is not obvious when printing on plain paper, but it becomes obvious once you mark, say, the upper-left front corner of the pages and then print different documents on them. I checked the generated PostScript code, and indeed it is different for one- versus two-page documents with respect to duplex printing, which probably causes the difference in paper handling. Obviously this makes it difficult to print pre-printed stationery in duplex mode. I expected others to have stumbled upon this, but could not find specific help so far. Any ideas? This is on OS X 10.6, and I checked two different printers.
