Search Results

Search found 5809 results on 233 pages for 'curl multi'.


  • How do I read multiple lines from STDIN into a variable?

    - by The Wicked Flea
    I've been googling this question to no avail. I'm automating a build process here at work, and all I'm trying to do is get version numbers and a tiny description of the build, which may be multi-line. The system this runs on is OS X 10.6.8. I've seen everything from using cat to processing each line as necessary, and I can't figure out what I should use and why. My attempts so far:

        read -d '' versionNotes

    Results in garbled input if the user has to use the backspace key. Also there's no good way to terminate the input, as ^D doesn't terminate it and ^C just exits the process.

        read -d 'END' versionNotes

    Works... but still garbles the input if the backspace key is needed.

        while read versionNotes; do
          echo " $versionNotes" >> "source/application.yml"
        done

    Doesn't properly end the input (because I haven't yet looked up how to match against an empty string).
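
    A minimal sketch of an approach that sidesteps the garbling (assuming bash, as shipped with OS X 10.6): read line by line, so the terminal's normal backspace handling applies, and stop on an empty line.

        versionNotes=""
        echo "Enter version notes (finish with an empty line):"
        while IFS= read -r line; do
          [ -z "$line" ] && break          # an empty line ends the input
          versionNotes+="$line"$'\n'       # accumulate, preserving newlines
        done
        printf '%s' "$versionNotes" >> "source/application.yml"

    Line-mode reads avoid the garbling that read -d causes, because the terminal stays in canonical mode while each line is typed.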

    Read the article

  • Protected flash video (requires HAL) on Gentoo

    - by Mala
    I am unable to play "protected" Flash video, such as Amazon Prime Instant Video. From what I've read and uncovered, this seems to be due to HAL not being installed on my computer. Confirmation that it is required for protected video can be seen towards the beginning of http://helpx.adobe.com/x-productkb/multi/flash-player-11-problems-playing.html. However, hal is not in the Gentoo Portage tree, and in any case it has been deprecated and replaced by udev. How can I go about getting Amazon Prime Instant Video to work again? I was considering grabbing the source from http://www.freedesktop.org/wiki/Software/hal but the links there won't load, and trying to install it from old ebuilds or from overlays which claim to still support it (e.g. kde-sunset) results in a compilation error:

        In file included from addon-generic-backlight.c:38:0:
        /usr/include/glib-2.0/glib/gmain.h:21:2: error: #error "Only <glib.h> can be included directly."

    Has anyone else solved this issue?
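
    That particular error is not HAL-specific: glib 2.32 and later refuse direct inclusion of sub-headers like glib/gmain.h. A possible (untested) workaround is to patch the old HAL source before building, roughly:

        # hypothetical fix-up against the unpacked hal sources
        sed -i 's|#include <glib/gmain.h>|#include <glib.h>|' addon-generic-backlight.c

    The same substitution may be needed in any other file that includes an individual glib/*.h header.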

    Read the article

  • How can I install another locale for Firefox?

    - by mfn
    I've installed FF 3.6.3 on a multi-user system in German; however, I'd like to have everything in English (interface, etc.) for my user, without installing a separate English version of FF. I've found the setting general.useragent.locale and an extension, Quick Locale Switcher, but I don't actually understand where to get the locale from (en-US in my case, I guess). I found pointers directing me to the official FTP release server with the hint to download the appropriate locale XPI, e.g. from http://releases.mozilla.org/pub/mozilla.org/firefox/releases/3.6.3/win32/en-US/ , however there's none there.
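
    If I recall the release layout correctly, the langpack XPIs for 3.6.x live in an xpi/ subdirectory rather than in the per-locale build directories, so something like the following may work (the exact URL is an assumption; browse the releases server to confirm):

        curl -O http://releases.mozilla.org/pub/mozilla.org/firefox/releases/3.6.3/win32/xpi/en-US.xpi

    After installing that XPI, setting general.useragent.locale to en-US for your user (via about:config, or a user.js line such as user_pref("general.useragent.locale", "en-US");) should give you the English interface without a second install.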

    Read the article

  • Download and process a file by FTP at set intervals, with error handling, rescheduling and status messages

    - by compound eye
    I want to download a data file from a remote FTP server to my machine at regular intervals. Once the file is downloaded I want to call another script which will process the file. My development machine is Mac OS X; the eventual deployment environment is Linux. What would be the stock standard way to automate this? I know I can use cron to schedule curl to download the file and to run a script that processes it at regular intervals, and I know I could write a slightly more complex script or an application that would do this and add error handling, rescheduling and status emails. But one of my requirements for this project is to write as little custom code as possible; instead I should try to use standard, tried and true existing tools, and if I do have to write code, to write the most straightforward code possible. The reason for this is that the code will potentially be installed on a large number of machines, all of which will need to be tweaked, customised and maintained by different people long after I am gone from the project, so the intention is to use well documented, well supported tools as much as possible. This seems such a common task that there must be tools and scripts all over the internet, written by people who have carefully considered everything that could possibly go wrong when you need to download and process a file from a remote server at regular intervals, with error handling, rescheduling and status messages. Is that what Expect is for? What would you recommend? (The system will be downloading weather prediction data every six hours, so that it can prepare in the event of bad weather warnings.)
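
    For reference, a minimal sketch of the cron-plus-curl approach (host, paths and addresses below are placeholders):

        #!/bin/sh
        # fetch-weather.sh -- download, then hand off to the processing script
        FILE=/var/data/forecast.dat
        if curl --fail --silent --show-error \
                --retry 3 --retry-delay 300 \
                -o "$FILE.tmp" "ftp://ftp.example.com/forecast.dat"; then
            mv "$FILE.tmp" "$FILE"               # only replace the file on success
            /usr/local/bin/process-forecast "$FILE"
        else
            echo "download failed" | mail -s "forecast fetch error" ops@example.com
            exit 1
        fi

        # crontab entry: run every six hours
        # 0 */6 * * * /usr/local/bin/fetch-weather.sh

    curl's built-in --retry covers transient failures, cron covers scheduling, and mail covers status messages, which keeps the custom code down to a few lines of glue.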

    Read the article

  • Best Practice: Migrating Email Boxes (maildir format)

    - by GruffTech
    So here's the situation. I've got about 20,000 maildir email accounts chewing up several hundred GB of space on our email server. Maildir by nature keeps thousands of tiny little files per account, instead of one .mbox file or the like, so I need to migrate several million files from one server to the other, for both space and life-cycle reasons. The conventional methods would all work just fine; rsync is the option that comes immediately to mind, but I wanted to see if there are any other "better" options out there. Rsync not handling multi-threaded transfers hurts in this situation because a single instance never actually gets up to speed and saturates my network connection; because of this, the transfer from one server to another will take hours upon hours, when it shouldn't really take more than one or two. I know this is highly opinionated and subjective and will therefore be marked community wiki.
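
    One low-custom-code way to get parallelism out of plain rsync is to run several instances over disjoint subsets of the mail store; a rough sketch (paths and host are placeholders):

        # run up to 8 rsyncs at once, one per top-level account directory
        cd /var/mail/store
        ls -d */ | xargs -P8 -I{} \
            rsync -a --relative {} newserver:/var/mail/store/

    --relative makes each per-directory rsync recreate its path under the destination root, and xargs -P keeps 8 transfers in flight, which usually gets much closer to saturating the link than one single-threaded rsync over millions of small files.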

    Read the article

  • How can I port forward over a VPN NAT?

    - by Charlie
    I have a multi-site VPN currently running on pfSense boxes using OpenVPN; however, I can change the OS and VPN type if need be. The main router has a 10.13.0.0/16 subnet and a series of public IPs. A branch, for example, has a 10.12.1.0/24 subnet. How can I port forward traffic on a public IP of the main router to a server behind the NAT of the second? So for instance, port 95 on a public IP assigned to the main router forwards to 10.12.1.102 behind the other router. Is this even possible? Currently my setup works great, but only for internal traffic.

    Read the article

  • Apache suddenly very slow on http and faster on https

    - by hsnm
    Background: I have Apache 2 running on Ubuntu. There is low usage on it; it is mostly accessed as a web service URL from mobile apps. It was working fine until I installed SSL certificates. I now have both http and https. When I access the server using https, I get a fairly quick response (though probably not as fast as before). When I use http, it's very slow. What I tried: following another post, I curl localhost from the host itself and it takes some time, meaning there is no routing issue. The server runs on an Amazon EC2 instance and is managed by me only. Also: I see that Apache, once running, creates the maximum number of processes it is allowed to, which was not the case before. I lowered MaxClients to 20 and I think I'm getting faster responses, but it still takes over a minute and I always have MaxClients Apache processes. dmesg returns many lines like:

        [ 1953.655703] TCP: Possible SYN flooding on port 80. Sending cookies.

    When I run netstat I get many entries in SYN_RECV. Possibly a DDoS attack? From EC2's monitoring diagrams I see a pattern of high "Maximum Network In (Bytes)" since 2 days ago. By the way, the server is still being tested; the actual traffic is very low and not consistent. I tried limiting incoming connections using iptables, still no luck, but I'm trying. Question: What could be the problem? Is this a DDoS attack?
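
    If it really is a SYN flood rather than plain load, a couple of standard knobs to try first (a sketch, not a complete DDoS answer; the limits are guesses to tune):

        # make sure SYN cookies are on (the dmesg line suggests the kernel
        # is already sending them)
        sysctl -w net.ipv4.tcp_syncookies=1
        # raise the backlog so legitimate clients queue instead of dropping
        sysctl -w net.ipv4.tcp_max_syn_backlog=4096
        # rate-limit brand-new connections to port 80
        iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
                 -m limit --limit 50/second --limit-burst 200 -j ACCEPT
        iptables -A INPUT -p tcp --dport 80 -m state --state NEW -j DROP

    On EC2 it is also worth tightening the security group to the expected client ranges while testing, since that filters traffic before it ever reaches the instance.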

    Read the article

  • How to troubleshoot whether a zip file is invalid or simply too big to be unzipped?

    - by mireille raad
    Hello, I am trying to unzip a file 2 GB in size and I am getting the following error:

        unzip CLTE_C_08.zip
        Archive: CLTE_C_08.zip
        End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
        unzip: cannot find zipfile directory in one of CLTE_C_08.zip or CLTE_C_08.zip.zip, and cannot find CLTE_C_08.zip.ZIP, period.

    After some googling, some people say this error occurs because the file is too big, others say the file is corrupt, and others say it may not be a Unix archive. So my question: how do I find out whether this is a valid archive file on my CentOS box, and what is the command/trick to uncompress big files (if any)? Thanks in advance :)
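
    Some quick checks that help distinguish "corrupt" from "too big for this unzip" (assuming the usual Info-ZIP tools, plus p7zip if installed):

        unzip -t CLTE_C_08.zip        # test the archive without extracting
        zipinfo CLTE_C_08.zip | head  # list contents; fails the same way if the directory is missing
        unzip -v | head -1            # archives over ~2 GB need Zip64; unzip 6.0+ has it, many older builds don't
        7za x CLTE_C_08.zip           # 7-Zip is often more forgiving with Zip64 archives
        zip -FF CLTE_C_08.zip --out fixed.zip   # attempt to rebuild a truncated archive

    If zip -FF produces a working fixed.zip, the original was most likely truncated during transfer rather than created too large.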

    Read the article

  • How can I write a mod_rewrite rule that changes https:// to http:// when the domain is not the main domain?

    - by Oudin
    I've set up a WordPress multi-site with a wildcard SSL certificate for example.com, to access the admin area securely. However, I'm also using domain mapping to map other domains to other sites, e.g. alldogs.com to alldogs.example.com. The problem is when I try to access the front end of a site from the admin area of a mapped domain, e.g. alldogs.com: clicking "Visit Site" goes to https://alldogs.com, because of the forced SSL applied to the admin area. That produces a certificate warning, since the certificate is for example.com and not alldogs.com. How can I write a mod_rewrite rule that detects that the requested host is not the main domain, e.g. example.com, and then changes https:// to http://, so the site is accessed via port 80 and mapped domains don't generate a certificate warning?
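
    A sketch of such a rule, assuming it lives in the vhost that serves the mapped domains (substitute your real main domain):

        RewriteEngine On
        # if the request came in over https but the host is not the main domain,
        # bounce it to plain http on the same host and path
        RewriteCond %{HTTPS} =on
        RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
        RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    One caveat: the browser completes the TLS handshake for https://alldogs.com before Apache can redirect, so a warning can still appear once; the cleaner long-term fix is to make WordPress emit http:// links for mapped domains in the first place.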

    Read the article

  • Why is my email server in AT&T's blacklist?

    - by legoscia
    I just got this bounce message:

        <¦¦¦¦¦¦¦¦@att.net>: host scc-mailrelay.att.net[204.127.208.75] said:
            521-88.208.246.34 blocked by sbc:blacklist.mailrelay.att.net.
            521 DNSRBL: Blocked for abuse. See http://att.net/blocks
            (in reply to MAIL FROM command)

    So I'm trying to figure out why our server ended up on their blacklist. The web page link doesn't tell me why, as far as I can see. From a few multi-RBL tools I conclude that our IP is only on the collateral damage lists of uceprotect.net (you can be exempt from that with a paid subscription), and I dearly hope that AT&T doesn't use that. From the mail server logs I see that an email to another @att.net address went through two days ago without being blocked. Does anyone have any ideas how I can find out what went wrong?
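
    For checking the common public blacklists by hand, the query pattern is the reversed IP prefixed to the list's DNS zone; for 88.208.246.34 that would look like this (the two zones below are just well-known examples):

        dig +short 34.246.208.88.zen.spamhaus.org    # Spamhaus
        dig +short 34.246.208.88.bl.spamcop.net      # SpamCop
        # any 127.0.0.x answer means "listed"; NXDOMAIN means clean

    AT&T's own list isn't publicly queryable this way, though; their blocks page is the official channel for delisting requests.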

    Read the article

  • Mac & Windows backup solutions - Offsite Backups

    - by Kristiaan
    I'm looking for some advice on a system I'm looking to implement within our company, but so far I have not found an adequate solution. I need to provide my users with a way to back up their laptops whilst in the office, and if possible offsite as well. We have a mixture of Windows & Mac laptops, so the software should ideally be multi-platform. This is the first time I am attempting to do something like this, as we normally charge the users with responsibility for their own backups. I have ruled out most of the services like Dropbox and SugarSync (unless one exists that does this) because, whilst they do exactly what I want, they do not give me any control over restoring / recovering data in the event of the user being unavailable, since they require the user's account password to access the data.

    Read the article

  • Recommendation for tuning 100s of SQL databases

    - by wayne
    I'm running several SQL Servers, each hosting a few hundred multi-gig databases for customers. They are all set up homogeneously as far as the schemas are concerned; however, customer usage of the data differs quite a lot from database to database. What would be the best way to auto-index/profile/tune this large number of databases? As there are at least 600 catalogs, I can't have someone manually profile and index each one as required by its usage patterns. I'm currently running SQL 2005 but will be moving to 2008, so solutions that work with either are fine.
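
    As a starting point for automating this on 2005/2008, the missing-index DMVs can be polled server-wide and ranked; a rough sketch (the benefit figure is a crude heuristic, and the DMVs reset on each instance restart):

        -- top missing-index suggestions across all databases since the last restart
        SELECT TOP 20
               DB_NAME(mid.database_id)               AS db,
               mid.statement                          AS table_name,
               mid.equality_columns, mid.inequality_columns, mid.included_columns,
               migs.user_seeks * migs.avg_user_impact AS rough_benefit
        FROM sys.dm_db_missing_index_details mid
        JOIN sys.dm_db_missing_index_groups mig
             ON mig.index_handle = mid.index_handle
        JOIN sys.dm_db_missing_index_group_stats migs
             ON migs.group_handle = mig.index_group_handle
        ORDER BY rough_benefit DESC;

    Scheduling something like this per instance and reviewing the top offenders is far more tractable than profiling 600 catalogs by hand.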

    Read the article

  • Sending a file to uClinux

    - by Mike
    I have a board running uClinux, and I need to send a file to it, but I'm not sure how... I have an RS232 port and an Ethernet port with an IP address at my disposal. I can telnet to the board, but uClinux doesn't have a built-in FTP client. What sort of options do I have for transferring files from my Windows 7 (or openSUSE) machine(s) to this board? EDIT: I just found I have a TFTP client on it:

        # tftp
        BusyBox v0.60.5 (2012.07.09-14:05+0000) multi-call binary
        Usage: tftp [OPTION]... HOST [PORT]

    But I can't find any good information on how to use tftp, and looking around Google I see it's mostly described as good for loading binary images, so I assume anything could be sent, but I'm not sure.
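
    Since the BusyBox applet is a client, the transfer runs the other way round: start a TFTP server on the desktop (for example tftpd32 on Windows, or atftpd on openSUSE) and pull or push from the board. A sketch, assuming the usual BusyBox flags (check tftp --help on the board, as old builds vary):

        # on the board: fetch a file from the TFTP server at 192.168.1.10
        tftp -g -r myfile -l /tmp/myfile 192.168.1.10
        # push a file from the board back to the server
        tftp -p -l /tmp/results -r results 192.168.1.10

    TFTP carries arbitrary binary data; "loading binary images" is just its most common use (netbooting), so regular files are fine.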

    Read the article

  • Best window manager for Linux for Virtual Desktop / Multimon

    - by mattcodes
    I previously used Ubuntu GNOME with Compiz, but for my basic-spec Intel MacBook (4 years old) it's a little too heavyweight. So for now I'm back on my MacBook with OS X, but I'm considering going back to Linux. I'm looking for a window manager with the following properties:

    1) Supports virtual desktops (need 4 minimum)
    2) Works well with multiple monitors - can move an app with a shortcut from one monitor to the other (on the same virtual desktop)
    3) Can remember window positions (i.e. open vim on the 2nd monitor) - however it must coerce everything back to the first window when the 2nd screen is unplugged
    4) Keyboard-shortcut friendly
    5) Not too hard to install
    6) Works well with minimal hardware such as integrated graphics

    Please suggest and share your experiences.

    Read the article

  • Touchscreen in a dual-monitor setup, but not as primary

    - by bazzjedi
    Hi, I have been looking for information about using a second monitor, but specifically a touchscreen: will a touchscreen monitor work in a dual setup when it is not set up as the main monitor? That is the question; I can't find any information on this, only on setups where it is the main monitor. The closest I found was this answer elsewhere:

        "If you mean do the touch screen features still work on the other machine, the answer is yes. Past that, you can even see some of the touch screen features on a non touch screen monitor (just not the multi touch features!) For example, on the taskbar, click (without releasing) on any icon and then drag the mouse up, and you will see that it does the same as using your finger and dragging up."

    I have an HP 2510 as the main display and am thinking of an HP 2310ti touchscreen as secondary; graphics card GeForce GTX 295, Windows 7 Professional. Regards, bazzjedi

    Read the article

  • How do I change the canvas size of a PNG with ImageMagick (GraphicsMagick)? (How to pad with transparency?)

    - by Pistos
    Alternatively: how do I take a non-square PNG and "fill out" the "rest" of the image with transparency, so that the resulting square image has the original image centered in the square? ULTIMATELY, what I want is to take any image of any GM-supported format and any size, and create a scaled-down PNG (say, 40 pixels maximum for either dimension), with aspect ratio maintained, transparency-padded for non-square original images, AND with an already-prepared 40x40 PNG transparency mask applied. I already know how to scale down and keep the aspect ratio; I already have the command for applying my composite. My only missing piece is square-alizing non-square images (padding with transparency). A single command is preferred; a multi-command chain is acceptable. (edit) Extra info: here's the composite command I'm using:

        gm composite -compose copyopacity mask.png source-and-target.png source-and-target.png

    where mask.png has white pixels for what I want to keep of source-and-target.png and transparent pixels for what I want to remove (and become transparent).
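
    With ImageMagick, the scale-and-pad step is a single command along these lines (GraphicsMagick's option set is close but not identical, so check gm convert -help for -extent support in your version):

        # scale to fit within 40x40, then pad to exactly 40x40 with transparency
        convert input.png -thumbnail 40x40 \
                -background none -gravity center -extent 40x40 \
                output.png

    -thumbnail preserves the aspect ratio, -background none makes the padding transparent, and -gravity center with -extent centers the original in the square, after which your existing gm composite mask step can run unchanged.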

    Read the article

  • Why does domain.com appear as theplanet.com when hosted on HostGator?

    - by silow
    I have a script that's supposed to detect the URL of its caller website. If the caller is another website, it should give something like http://callersite.com. I'm using this line of PHP code (though I suspect the language won't matter for sysadmins):

        gethostbyaddr($_SERVER['REMOTE_ADDR'])

    I'm testing with a caller site that's hosted on HostGator. What I'm noticing, though, is that I don't get callersite.com; I get something like 1a.12.12ab.static.theplanet.com. I don't know what theplanet.com is or why I'm not getting callersite.com. What do I need to do to really get the domain of the site making a call to my script? -- Thanks for the explanations. Some have advised I use $_SERVER['HTTP_REFERER'], but that's not what I'm after. My script acts as an API: another website makes a curl request to it, gets an output, and later presents it to the user. So the HTTP referer is empty, since callersite.com makes a direct server-side call to me. So, any hope?
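
    For context: gethostbyaddr does a reverse DNS lookup on the caller's IP, and The Planet was a hosting company whose name sits on that PTR record, so the result is expected. There is no DNS-side way to recover a site name from a server-to-server call; the usual pattern is to make callers identify themselves, e.g. with a site parameter plus a shared key. A sketch (names are made up; lookup_key_for() stands in for your own key store):

        <?php
        // caller side: curl "http://api.example.com/script.php?site=callersite.com&key=SECRET"
        $site = isset($_GET['site']) ? $_GET['site'] : '';
        $key  = isset($_GET['key'])  ? $_GET['key']  : '';
        if ($key === '' || $key !== lookup_key_for($site)) {  // hypothetical helper
            header('HTTP/1.0 403 Forbidden');
            exit;
        }
        // $site is now a verified caller identity

    The site value is self-reported, which is why it has to be paired with a secret rather than trusted on its own.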

    Read the article

  • Can I use the snipping tool to take a screenshot of the windows 8 start screen or modern apps?

    - by Journeyman Geek
    One of the tools I've found invaluable in answering questions on SU is the Snipping Tool. I may on occasion need to take screenshots of part of the Start screen or 'modern' apps. I may not want to take a complete screenshot, and while I can use PrtScr and switch back into the desktop to paste it, this is clunky if I need to document a multi-step process. Can I use the Snipping Tool on modern apps or the Start screen? If not, is there a configurable way to save a series of screenshots to a fixed folder, say when I press a combination of keys, so I can work, screenshot, then crop and annotate the folder of images?

    Read the article

  • How do I prevent 'net ads join' from doing DDNS update?

    - by genehack
    I'm using 'net ads join' to add Linux servers to an AD domain. The servers are multi-homed, with a public IP on eth1 and a non-routable private background network on eth0 (in the 172.20 space, used for netboots and installs and stuff -- no routing to the Internet on that network). When I 'net ads join', it appears that a DDNS entry is getting created for the 172.20 interface. How can I prevent this from happening? (FWIW, my powers at the AD level are very limited -- I can join servers and delete server records but that's about it...)
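
    Two angles that may be worth checking; both are assumptions to verify against your Samba version's man pages. Newer Samba builds accept a flag to skip the DNS update during the join, and smb.conf can restrict Samba to the routable interface so any registration it performs uses the eth1 address:

        # skip DDNS at join time, if the flag exists in your build
        net ads join -U joinaccount --no-dns-updates

        # smb.conf: bind only the public interface
        [global]
            interfaces = eth1
            bind interfaces only = yes

    Failing that, since you can delete server records, removing the stale 172.20 record after each join is a crude but workable fallback.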

    Read the article

  • Visual Studio Development on Virtual Box, Boot Camp, or VMWare Fusion

    - by Eli
    I currently have a Mac, 2 GHz and 2 GB of RAM, running OS X Leopard and VirtualBox with a Windows 7 Pro 32-bit virtual machine. Performance on the virtual machine is fine for minor tasks but very clunky when multi-tasking or developing in Visual Studio 2008. What would be my best option for being able to use Visual Studio, keeping cost and time in mind? 1) Upgrade the RAM to 4 GB ($100). Will this really improve my performance enough to use Visual Studio in a Windows 7 VM, or am I just wasting time/money? 2) Reinstall/restore the Windows 7 disk image as a Boot Camp partition. I assume this should improve my performance, yes? 3) Purchase VMware Fusion instead of VirtualBox. Does Fusion require fewer resources to run? I am open to any suggestions. Thanks in advance.

    Read the article

  • Lightweight window manager for Linux for Virtual Desktop / Multimon

    - by mattcodes
    I previously used Ubuntu GNOME with Compiz, but for my basic-spec Intel MacBook (4 years old) it's a little too heavyweight. So for now I'm back on my MacBook with OS X, but I'm considering going back to Linux. I'm looking for a window manager with the following properties:

    Supports virtual desktops (need 4 minimum)
    Works well with multiple monitors - can move an app with a shortcut from one monitor to the other (on the same virtual desktop)
    Can remember window positions (i.e. open vim on the 2nd monitor) - however it must coerce everything back to the first screen when the 2nd screen is unplugged
    Keyboard-shortcut friendly
    Not too hard to install
    Works well with minimal hardware such as integrated graphics

    Please suggest and share your experiences.

    Read the article

  • Use both OpenVPN & eth0 together

    - by shadyabhi
    I connect to a VPN using OpenVPN. After the connection is established, all my traffic goes through tun0. My LAN gateway is 10.100.98.4, so for apps to use my direct internet connection I did:

        sudo route add default gw 10.100.98.4

    But I can't use tun0 now. I know this because curl --interface tun0 google.com doesn't give me anything. How do I go about using both connections simultaneously? How can I achieve that? Routing tables:

    Without the VPN running:

        Destination      Gateway          Genmask          Flags  Metric  Ref  Use  Iface
        10.100.98.0      *                255.255.255.0    U      1       0    0    eth0
        default          10.100.98.4      0.0.0.0          UG     0       0    0    eth0

    With the VPN:

        Destination      Gateway          Genmask          Flags  Metric  Ref  Use  Iface
        10.10.0.1        10.10.54.230     255.255.255.255  UGH    0       0    0    tun0
        10.10.54.230     *                255.255.255.255  UH     0       0    0    tun0
        free-vpn.torvpn  10.100.98.4      255.255.255.255  UGH    0       0    0    eth0
        10.100.98.0      *                255.255.255.0    U      1       0    0    eth0
        default          10.10.54.230     0.0.0.0          UG     0       0    0    tun0

    After the route command:

        Destination      Gateway          Genmask          Flags  Metric  Ref  Use  Iface
        10.10.0.1        10.10.54.230     255.255.255.255  UGH    0       0    0    tun0
        10.10.54.230     *                255.255.255.255  UH     0       0    0    tun0
        free-vpn.torvpn  10.100.98.4      255.255.255.255  UGH    0       0    0    eth0
        10.100.98.0      *                255.255.255.0    U      1       0    0    eth0
        default          10.100.98.4      0.0.0.0          UG     0       0    0    eth0
        default          10.10.54.230     0.0.0.0          UG     0       0    0    tun0
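
    With two default routes in one table, only the first is ever used, which is why one interface always loses. Policy routing keeps both usable by choosing a default per source address; a minimal iproute2 sketch (the table number is arbitrary):

        # traffic sourced from the LAN address goes out via the LAN gateway
        ip route add default via 10.100.98.4 dev eth0 table 100
        ip rule add from 10.100.98.0/24 lookup 100
        # leave the VPN's default route alone in the main table

    After this, sockets bound to the eth0 address (e.g. curl --interface eth0) route via the LAN gateway, while everything else keeps using tun0.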

    Read the article

  • Cisco 2900 series router - 3x 3G HWIC - Can you use the same subnet for each HWIC?

    - by Lance
    We host a site with a 2900 series router with 3x 3G HWIC cards installed. It is hosted with Telstra and plugs into our corporate WAN. Each card authenticates against RADIUS and advertises a route into the WAN for the subnet it serves; we have always used the same advertised subnet on each. Telstra have advised us that this could be the cause of some drop-out issues whereby some services work for some people and not for others: effectively, they are saying their system will only use one of the cards at a time, even though we can see each interface is online and assigned a WAN IP address. Has anyone out there configured a multi-HWIC setup before, and if so, are you using different subnets for each or the same?

    Read the article

  • Can I backup my IMAP Gmail account locally using only Alpine?

    - by BasicObject
    I recently discovered CLI email clients and have fallen in love with their speed and simplicity. After playing around with mutt and alpine, I've decided I favor alpine. I am a Gmail IMAP user and have many years of emails that I'd like to store locally. Is there a more or less convenient way to retain IMAP functionality and back up, on a weekly basis, only the emails that haven't been backed up already? I have alpine set up with my Gmail over IMAP and it's working great. I'm just wondering if there is a way to make an offline backup or "archive" locally on my computer, while retaining the multi-device access that IMAP offers. I apologize if this has been asked before; I did search for it and did not find my answer. Thank you for reading.
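
    Alpine by itself only copies messages when told to, so for an unattended incremental mirror the usual companion tool is offlineimap, which can run weekly from cron while alpine keeps using IMAP as before. A sketch of ~/.offlineimaprc (account names and paths are placeholders; with two-factor auth, remotepass would be an app password):

        [general]
        accounts = Gmail

        [Account Gmail]
        localrepository = Local
        remoterepository = Remote

        [Repository Local]
        type = Maildir
        localfolders = ~/mail-backup

        [Repository Remote]
        type = Gmail
        remoteuser = you@gmail.com
        remotepass = app-password

    Each run only fetches messages not already in the local Maildir, which gives the incremental weekly backup without touching the server-side mail.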

    Read the article

  • Never getting a JSON response when running server-side PHP proxy script but I do with others

    - by Dohk
    I'm on PHP 5.3.4 and Apache 2.2, btw. So I'm using (or trying to use) Simple PHP Proxy (Simple PHP Proxy). I enter a URL at his example page at SPP Example Page and it works fine: I see the JSON response and all the headers. However, when I copy the exact URL, changing only the host to localhost, I get both empty headers and no JSON. Assuming that the script on his site is the same one I downloaded, could this be due to a multitude of things, or a setting in Apache and/or php.ini? So for example:

        benalman.com/code/projects/php-simple-proxy/ba-simple-proxy.php?url=http://github.com/&full_headers=1&full_status=1

    gets me a ton of info back. Now, changing to localhost:

        http://localhost/ba-simple-proxy.php?url=http://github.com/&full_headers=1&full_status=1

        {"headers":[],"status":{"url":"https:\/\/github.com\/","content_type":"text\/html","http_code":301,"header_size":194,"request_size":182,"filetime":-1,"ssl_verify_result":0,"redirect_count":1,"total_time":0.094,"namelookup_time":0,"connect_time":0.047,"pretransfer_time":0,"size_upload":0,"size_download":185,"speed_download":1968,"speed_upload":0,"download_content_length":185,"upload_content_length":0,"starttransfer_time":0,"redirect_time":0.047,"certinfo":[]},"contents":null}

    I even went basic and just used some curl, and of course empty objects were returned, other than false for my content and the URL I set in my JSON. Any help is deeply appreciated, as are any ideas.
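
    One clue in that status block: the fetch got a 301 to https://github.com/, followed one redirect, and downloaded only 185 bytes with contents null, which is consistent with cURL failing at the HTTPS step on the local box (commonly a missing CA bundle). A quick standalone test, written on the assumption that it mirrors what the proxy does internally:

        <?php
        // minimal reproduction of the proxy's fetch, with SSL diagnostics
        $ch = curl_init('http://github.com/');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow the 301 to https
        // diagnostic only -- if this "fixes" it, the real fix is installing
        // a CA bundle (curl.cainfo / CURLOPT_CAINFO), not disabling verification
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        $body = curl_exec($ch);
        var_dump(curl_error($ch), strlen($body));

    If curl_error reports a certificate problem with verification on, point php.ini (or the script) at a cacert.pem and the proxy should start returning contents.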

    Read the article
