Search Results

Search found 2442 results on 98 pages for 'feed'.

Page 52 of 98

  • Using VirtualBox/VMWare to deploy software to multiple sites?

    - by Sim
    Hi all, I'm currently evaluating the feasibility of using VirtualBox (or VMWare) to deploy the following stack to 10 sites:
      - Windows XP
      - MSSQL 2005 Express Edition with Advanced Services
      - JBoss, running one in-house application that mostly queries master data (customers/products) and feeds it to other software
    Why do I want to do this? Because the IT staff at my 10 sites are not capable enough, and the steps needed to set up this in-house project are complicated. The cons I can foresee:
      - Extra horsepower is needed to run the VirtualBox instance
      - The IT staff won't learn much about how to install the components themselves
      - Cost (a VirtualBox license for a commercial environment, as well as extra OS licenses)
    I'd really appreciate your input on the pros and cons of this approach, or any links I can read further. Thanks a lot.
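
    If the VirtualBox route is chosen, the usual way to ship one prepared VM to many sites is to export it once as an appliance and import it everywhere else; a hedged sketch (the VM name "SiteAppliance" and the file name are placeholders):

        # On the machine where the XP + MSSQL Express + JBoss VM was prepared:
        VBoxManage export "SiteAppliance" --output site-appliance.ova

        # At each of the 10 sites, after copying the .ova across:
        VBoxManage import site-appliance.ova

        # Start it without a GUI so local staff never have to touch the VirtualBox UI.
        VBoxManage startvm "SiteAppliance" --type headless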

    Read the article

  • Why are default SpamAssassin rules not being applied to emails we generate?

    - by Chance
    My company uses a standalone SpamAssassin install to test marketing emails; however, mail originating from us does not seem to run the full gamut of tests. For example, SpamAssassin has a default rule that flags messages containing the phrase "Dear [Something]", and it properly flags spam that I feed it. It does not, however, apply that same rule to in-house email I send it. Is it possible that SpamAssassin has white-listed us somehow, perhaps because the mail originates in the same domain as the server or receiver? I believe most of the recent SpamAssassin questions have been mine, so thanks for bearing with me as I figure this out! Chance
    EDIT: Details on our SA setup: we are piping the emails into the command line with spamc -R < test_email.eml. Results are identical when testing as root or as a regular user, and there is no user_prefs file.
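
    One way to see which rules actually fire for a given message, and whether a trust-path rule such as the stock ALL_TRUSTED rule is discounting locally generated mail, is to run the message through the spamassassin script in test/debug mode (a hedged sketch; it assumes the full spamassassin program, not just spamc, is installed, and test_email.eml is the message from the question):

        # Test mode (-t) prints the full report with every rule that hit;
        # debug mode (-D) additionally shows how the message's relays were classified.
        spamassassin -D -t < test_email.eml 2>&1 | less

        # Quick check for trust-related hits only.
        spamassassin -t < test_email.eml | grep -i trusted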

    Read the article

  • How to get started setting up IP security cameras

    - by dave
    I have just finished renovating my house. As part of the job, I had Cat6 cable run through the house, including two external points. All cables terminate in the same location. My rough plan is to install two IP cameras to monitor the front and rear, power the two external cameras with PoE from a central router, plug my PC into the same router, and run magic software X. Any machine plugged into the router, or connected wirelessly, should then be able to get a live feed and alerts based on motion detection. That's the plan, but I'm not sure how feasible it is, what hardware to get, or what monitoring software to use. Has anyone done something similar? What were your experiences?

    Read the article

  • YSlow says certain CSS files are not gzipped

    - by rhand
    YSlow keeps telling me that files like http://www.example.com/wp-content/plugins/q-and-a/css/q-a-plus.css?ver=1.0.6.2 are not gzipped, while the gzip test tool at Feed the Bot says I am all good:
        Compressed?               Yes
        Compression type          gzip
        Page size (Bytes)         32,493
        Compressed size (Bytes)   -1
        Saving (Bytes)            32,494
        Compression %             100%
    I added this to my .htaccess:
        # Gzip
        <ifModule mod_gzip.c>
          mod_gzip_on Yes
          mod_gzip_dechunk Yes
          mod_gzip_item_include file .(html?|txt|css|js|php|pl)$
          mod_gzip_item_include handler ^cgi-script$
          mod_gzip_item_include mime ^text/.*
          mod_gzip_item_include mime ^application/x-javascript.*
          mod_gzip_item_exclude mime ^image/.*
          mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
        </ifModule>
        #Deflate
        <ifmodule mod_deflate.c>
          AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript
        </ifmodule>
    The response headers for the file mentioned are:
        CF-Cache-Status    MISS
        CF-RAY             13945df90a9a0c1d-AMS
        Cache-Control      public, max-age=2592000
        Connection         keep-alive
        Content-Encoding   gzip
        Content-Type       application/javascript
        Date               Thu, 12 Jun 2014 07:34:38 GMT
        Expires            Sat, 12 Jul 2014 07:34:38 GMT
        Last-Modified      Thu, 21 Feb 2013 01:29:18 GMT
        Server             cloudflare-nginx
        Transfer-Encoding  chunked
        Vary               Accept-Encoding
    Any ideas what I am missing here?
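
    A quick way to cross-check what YSlow sees is to request the file with and without gzip support advertised and compare the headers (a hedged sketch; the URL is the one from the question and curl is assumed to be available):

        # Request the stylesheet the way a browser would, advertising gzip support,
        # and print only the response headers.
        curl -s -o /dev/null -D - -H "Accept-Encoding: gzip, deflate" \
          "http://www.example.com/wp-content/plugins/q-and-a/css/q-a-plus.css?ver=1.0.6.2"

        # Compare with a request that does not accept gzip; if Content-Encoding: gzip
        # appears only in the first case, compression is being negotiated correctly.
        curl -s -o /dev/null -D - \
          "http://www.example.com/wp-content/plugins/q-and-a/css/q-a-plus.css?ver=1.0.6.2"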

    Read the article

  • Using 4 monitors with 2 generic video cards... possible?

    - by Ikram
    I'm thinking of setting up 4 monitors in a grid, using two video cards, with each card feeding one pair of monitors through its two DVI ports. The most important requirement for me is to have the grid of monitors act as one single huge screen. Is this scenario possible using two generic cards like Radeon 4870s on a Windows 7 computer? (I've heard of Eyefinity, but 4870s don't have it.) Another issue is that I only have one PCI-Express slot on my computer's motherboard, so I'll need to use a plain PCI card as the second video card. Will this pose problems?

    Read the article

  • HP LaserJet 4200 madness

    - by C-dizzle
    I have an HP LaserJet 4200 printer that is acting really weird. I can print from the Internet just fine, no problem, but when I try to print a document from Microsoft Word, for example, I always get prompted for "Manual Feed Tray 1". I have racked my brain on this and cannot get it figured out. It's a networked printer and only an administrator account can make changes to it. I have also reset the printer back to factory settings, and uninstalled and reinstalled it on the PCs and the server it is connected to. I'm running out of hair since I'm pulling it all out... please help.

    Read the article

  • How do you persist installed software & configurations on an Amazon EC2 instance?

    - by Richard
    I've gotten a base Debian AMI up and running and now I need to know the best way to maintain it. I've run the updates (aptitude update/upgrade) and installed/configured my software (Apache, Ruby, etc.), but if I reboot the instance or start a new one I'll have to do all this work over again. How do you persist these types of things over a reboot? Do you build a new AMI every time you adjust some tiny piece of the system? Or is there some way to feed it a script on startup that configures it in "real time"? I know I could go all the way with a Reductive Labs Puppet-style setup, but that's a bit too much for my needs right now (1-2 servers). Any best practices on this? Update: I found a bit of information on using User-Data to run scripts at instance boot time.
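
    For the user-data route mentioned in the update, a minimal sketch of a boot-time script (the package list and config URL are hypothetical; it assumes an AMI whose boot hooks execute user data that begins with #!):

        #!/bin/bash
        # Passed as EC2 user data; runs on first boot of the instance.
        set -e

        # Bring the base Debian image up to date.
        aptitude update && aptitude -y safe-upgrade

        # Install what this role needs (hypothetical list).
        aptitude -y install apache2 ruby

        # Pull role-specific configuration from somewhere you control
        # (hypothetical URL; could just as well be S3, git, or rsync).
        wget -qO /tmp/site-config.tar.gz http://config.example.com/webserver.tar.gz
        tar -xzf /tmp/site-config.tar.gz -C /etc/apache2/

        /etc/init.d/apache2 restart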

    Read the article

  • Standard for feeding test data to a Nagios plugin?

    - by chiborg
    I'm developing a Nagios plugin in Perl (no Nagios::Plugin, just plain Perl). The error condition I'm checking for normally comes from the output of a command called inside the plugin. However, it would be very inconvenient to create the error condition for real, so I'm looking for a way to feed test output to the plugin to see if it works correctly. The easiest way I have found so far is a command line option that optionally reads input from a file instead of calling the command:

        if ($opt_f) {
            open(FILE, $opt_f);
            @output = <FILE>;
            close FILE;
        } else {
            @output = `my_command`;
        }

    Are there other, better ways to do this?
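
    A usage sketch for that file-based test path (the plugin and fixture names are hypothetical):

        # Capture one real run of the wrapped command as a fixture...
        my_command > t/sample_output.txt

        # ...then exercise the plugin against the canned output instead of the live command.
        ./check_mything.pl -f t/sample_output.txt
        echo "exit code: $?"   # Nagios maps 0/1/2/3 to OK/WARNING/CRITICAL/UNKNOWN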

    Read the article

  • How do I start Silverlight on Netflix in full screen automatically?

    - by KronoS
    I know there is a great big button I can click for full screen at the bottom left of the screen. What I'm doing, though, is launching the direct movie/TV episode in Chrome from XBMC using XBMCflicks. It's like pasting the video's HTML address into the address bar and going directly to the video feed. Here's my dilemma: I want a fully working HTPC with no need for a keyboard or mouse, which means I don't want to have to click on the 'full screen' button. Is there a way to run Chrome and Silverlight automatically in full screen mode when launched?

    Read the article

  • How to remove a large number of files/folders in Linux

    - by user1745713
    We are using Hadoop to split a table into smaller files to feed to Mahout, but in the process we created a huge number of _temporary logs. We have an NFS mount for the Hadoop volume, so we can use all the usual Linux commands to delete folders and files, but we just can't get them deleted. Here's what I've tried so far:
      - hadoop fs -rmr /.../_temporary : hangs for hours and does nothing
      - rm -rf /.../_temporary (on the NFS mount): hangs for hours and does nothing
      - find . -name '*.*' -type f -delete : same as above
    The folders look like this (there are 38 of them inside _temporary):
        drwxr-xr-x 319324 user user 319322 Oct 24 12:12 _attempt_201310221525_0404_r_000000_0
    The contents of these are actually folders, not files; each one of those 319322 folders has exactly one file inside. I'm not sure why the logging is done this way. Any help is appreciated.
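
    One trick that often behaves better than rm or find on directories with millions of entries is syncing an empty directory over the target with rsync --delete (a hedged sketch; TARGET stands for the elided /.../_temporary path from the question):

        # Mirror an empty source directory onto the bloated one; with --delete,
        # everything not present in the (empty) source gets removed.
        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ TARGET/

        # Once it is empty, the directory itself can go.
        rmdir TARGET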

    Read the article

  • Plugging a 3.3V 50-pin laptop HDD into USB?

    - by barlop
    I have a 50-pin laptop hard drive, 1.8" wide. This 50-pin connector concerns me. Even if I get an adaptor, how can I know which side of the connector takes the power? I don't want to plug it in the wrong way, and I don't have an adaptor yet. Links to suitable adaptors would be welcome, but the main question is which way round to plug it in when I get the adaptor. I want to be sure; I do not want to blow the HDD. For the 3.3V I have a plan: connecting green and black and using the orange cable (3.3V) to feed power. I am not too worried about that bit. But as I said, the main thing is I want to know which side is 3.3V. The hard drive is an MK6006GAH.

    Read the article

  • Recommendations: Good Network MFP Printer/Scanner

    - by Joeme
    Hi, we have a small office that is expanding. At the moment we have one HP J6424 MFP, shared using its built-in network port. It is now becoming a headache: we have daily problems with people not being able to print or scan, jobs just sitting in the queue, or the scanner not being detected. Sometimes people can print but not scan, sometimes scan but not print, sometimes a bit of both. We are also pretty much constantly printing or scanning, or trying to! I would like to get a laser MFP (mono is fine) which works well for scanning and printing over the network with multiple users. Alternatively, any recommendations for network scanners (sheet feed and/or duplex a bonus). Clients are Windows 7 and Mac. Thanks very much!

    Read the article

  • yum update with shared cache

    - by Sammitch
    We've got a big batch of RHEL6 machines that are due for patching, and for some reason the process here does not involve a local repo. I'm new here; I've asked why ("it just didn't work") and I don't have enough time to make one work before the window that's already scheduled. The usual method is to install yum-downloadonly and run
        yum update --downloadonly --downloaddir=/mnt/cifs_share
    and then
        yum update /mnt/cifs_share/*.rpm
    which just does not look right to me, since not all of these machines have the same set of installed packages. The method I tried today was mounting the share at /var/cache/yum/x86_64/6Server/rhel-x86_64-server-6/packages/, which worked, but then yum automatically deleted everything once it finished. I've looked over the yum man page, but I don't see any flag I can feed it to stop it from deleting everything, nor a flag like up2date's --tmpdir=/mnt/cifs_share. Can anyone out there help me kludge this together until I can get a local repository working?
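
    A hedged sketch of the shared-cache approach: keepcache and cachedir are standard [main] options in /etc/yum.conf, and recent yum versions let them be overridden per run with --setopt (the share path is the one from the question; if your yum build lacks --setopt, set the same two options directly in /etc/yum.conf instead):

        # Keep downloaded RPMs after the transaction and point the cache at the share.
        yum --setopt=keepcache=1 \
            --setopt=cachedir=/mnt/cifs_share/yum-cache \
            update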

    Read the article

  • Making audio CDs en masse - Linux-based solutions?

    - by The Journeyman geek
    My mom sings and gives away CDs to people. Invariably it falls to me to burn the CDs for her, and burning 50-100 CDs on a single drive is a pain. I do have a handful of CD burners and a slightly geriatric old PIII 450. This is what I want to be able to do: either point an application at a folder of WAVs or MP3s, say how many copies I need on the CLI (since then I can SSH into the system and use it headless), and feed CDs to two or more burners until it's done; or pop a single CD into a master drive and have its contents duplicated to two or more burners. I'd rather have it running on Linux, be command-line based, and be as little work as possible; almost automatic, short of telling it how many copies I want, would be ideal. I'm sure people will wonder about legality: my mom sings her own music, it's classical, and it's older than copyright law, so that's a non-issue. I just want a way to make this chore a little easier, short of telling my mom to do it herself.
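
    A hedged sketch of the folder-of-WAVs case using wodim from cdrkit (the script name, device and copy count are placeholders, and MP3s would need decoding to WAV first, e.g. with mpg123 or lame --decode):

        #!/bin/bash
        # Usage: burn_copies.sh <wav_folder> <copies> [device]
        DIR=$1
        COPIES=$2
        DEV=${3:-/dev/sr0}

        for i in $(seq 1 "$COPIES"); do
            echo "Insert a blank CD in $DEV and press Enter (copy $i of $COPIES)"
            read -r
            # -audio writes the tracks as CD-DA; -pad rounds each track up
            # to a whole number of audio sectors.
            wodim dev="$DEV" -v -audio -pad "$DIR"/*.wav
            eject "$DEV"
        done

    Running one copy of the script per burner, each with its own device argument, covers the "feed two or more burners" part over SSH.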

    Read the article

  • How can I load one image over network to multiple computers on boot?

    - by user754730
    A few years ago I saw this in a company, but I don't know how it was built. There was one computer acting as the server (I don't know whether it ran Windows Server or plain Windows 7) and three other computers as clients (Windows 7). As soon as the Windows 7 clients were started, they all booted the same image over the network (I don't know whether it was literally the same image file or just the same state) and could be worked on normally. As soon as a machine was shut down, all the changes made to the system were erased. How could I build a system like this, so that I have one image file which I keep up to date and then feed to the other machines in my network? That is basically what it would look like.
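
    One common way to build this is PXE/network booting every client from a single image kept on the server; a hedged sketch of the DHCP/TFTP piece using dnsmasq (the interface, address range and paths are assumptions, /srv/tftp is assumed to hold pxelinux.0 plus the shared image's kernel and initrd, and the image itself still has to be prepared separately, e.g. as a read-only NFS root so runtime changes stay in RAM and vanish at shutdown):

        dnsmasq --interface=eth0 \
                --dhcp-range=192.168.1.100,192.168.1.150,12h \
                --dhcp-boot=pxelinux.0 \
                --enable-tftp \
                --tftp-root=/srv/tftp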

    Read the article

  • How to stop PowerShell mangling command line options for a program executed from the shell?

    - by kem
    From the PowerShell prompt, when I try to run a program and feed it a command line option, PowerShell ends up mangling the option. Why does this happen? Is there any way to stop it besides enclosing the option in quotes? For example, from the PowerShell prompt:
        PS Microsoft.PowerShell.Core\FileSystem::\\mach\share> .\myprog.exe -file=input.txt
    myprog.exe ends up getting two arguments:
      1) -file=input
      2) .txt
    I need to run it like
        .\myprog.exe "-file=input.txt"
    or
        .\myprog.exe '-file=input.txt'
    to force it to be one argument. No other shell does this.

    Read the article

  • How can I insert the quoted price of gold from kitco.com into my Excel spreadsheet?

    - by Frank Computer
    Kitco.com provides a realtime price quote for gold and other metals. I have a spreadsheet which makes calculations based on the price of gold, and I would like this realtime value to be updated automatically in my Excel sheet. I tried 'Get External Data' from a website, but that didn't work. Any ideas? EDIT ADDED: Kitco has a gadget called KCAST which displays realtime quotes on the Windows taskbar. I tried capturing those values from the taskbar, but that didn't work either. Maybe it could be done if Kitco provided an API or feed?

    Read the article

  • Need recommendations for a hardy scanner that has a robust feeder tray

    - by JohnyD
    In the early days of our company all our information came in on paper, and all of what we sold was on paper. Because of this we literally rent an old bank vault to house the millions of sheets of paper that, some say, still contain relevant information. That being said, I'm looking into purchasing some hardware capable of scanning all these documents and converting them to PDF. Being new at this level of digitization, I would like to ask for recommendations for accomplishing this task. Most of this material exists as separately bound studies/articles/etc. Someone would have to remove the bindings, load many pages at a time, and have the scanner feed them all through and convert them to a single PDF (one PDF per study/article/etc.). If you have any recommendations I would very much appreciate hearing about them, thanks.

    Read the article

  • How do you pick what server setup you need?

    - by ed209
    I recently started receiving a pubsub data feed from Etsy. It averages around 250 notifications per minute, but obviously, when the USA wakes up, that spikes quite heavily. I want to be able to deal with those spikes (about 3 per day); the rest of the day is fine. What's the best method of working out the right server configuration? My current approach is to keep upgrading until the server stops dying... the next leap is:
      - Processor: AMD Phenom II X6 1055T hexa-core
      - RAM: 4GB DDR2 SDRAM
      - HD1: SATA drive (7,200 rpm) (+500 GB 7,200 rpm SATA hard drive)
      - HD2: SATA backup drive (+500 GB SATA, 7,200 rpm)
      - OS: Linux (+CentOS 5 64-bit)
      - Bandwidth: 6000GB monthly transfer (3000 in + 3000 out) (+100M uplink port)
    What's the best approach to working out what sort of server setup you need?

    Read the article

  • RSS downloader script

    - by The Digital Ninja
    I have a Linux-powered Synology NAS at my house. I'm looking to set up a cron script to check a group of RSS feeds and automatically download new video podcasts to a shared folder. I can do most of the scripting, such as deleting files older than 3 weeks and the wget parts, but I'm not sure how to parse the RSS feed and check dates so I only grab the latest episode. I figured it's best not to re-invent the wheel, and surely someone out there has a command-line RSS downloader or some such script. Any ideas?
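
    A minimal sketch of the parsing step in shell, assuming xmllint (from libxml2) and wget are available on the NAS and that the feed uses standard <enclosure url="..."> elements; the feed URL and destination folder are placeholders:

        #!/bin/sh
        FEED_URL="http://example.com/podcast.rss"    # placeholder feed
        DEST=/volume1/video/podcasts                 # placeholder shared folder

        # Fetch the feed and pull the enclosure URL of the first item,
        # which in most feeds is the newest episode.
        LATEST=$(curl -s "$FEED_URL" \
          | xmllint --xpath 'string(//item[1]/enclosure/@url)' -)

        # wget -nc skips the download if the episode is already present,
        # so this is safe to run from cron every hour.
        [ -n "$LATEST" ] && wget -nc -P "$DEST" "$LATEST"

        # Housekeeping mentioned in the question: drop episodes older than 3 weeks.
        find "$DEST" -type f -mtime +21 -delete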

    Read the article

  • Preloading RSS content in Thunderbird before actually reading it

    - by Berry Tsakala
    I have Thunderbird 3.x and I'm subscribed to several RSS feeds. How can I tell Thunderbird to load/download any new RSS items in the background? The usual behaviour with RSS feeds is that it downloads the headers, or a few introductory lines of the content, but only when I click a feed item does it start loading "for real". I really want to receive the feeds without waiting for them to load, the same way I receive emails in any email client: all messages fully downloaded at once. There are several reasons, by the way: e.g. if I only have a short connection window, I'd rather connect, sync everything at once, and read it later; or if I have a slow wifi connection, it's annoying to wait for each and every message while the computer sits idle as I read. Thanks.

    Read the article

  • FPS lags with new Acer Aspire 5755G

    - by Calvin
    The title is kind of self-explanatory: my new laptop lags and has FPS drops. For example, my FPS in StarCraft 2 hovers around 20 and constantly drops to 1 on low settings, when I know it should run smoothly on high settings. I've updated my Nvidia driver and set the preferred global setting to the 'High-performance Nvidia processor'. Here are some screenshots: Screen Shot One, Screen Shot Two, Screen Shot Three. I'm not sure how to fix this problem; any feedback would be nice!

    Read the article

  • Software for a company-internal website [closed]

    - by LordT
    Hope this is the right Stack Exchange site to ask this: we have a group of web pages/services at work (SE startup), ranging from SVN, Trac, and continuous integration to link collections and a DMS. Nearly everything has an RSS feed to get the info I need, with the exception of SVN. I'm looking for some kind of software that can integrate these well on a kind of start page. The most recent changes, upcoming events, etc. should be clearly visible, as well as an option to search (the search itself will be provided by a different tool). A news area should be included as well. Currently I'm pondering doing this with either WordPress or TWiki, although WordPress seems to be the simpler way to get something good-looking quickly. Authentication should be handled by HTTP Basic Auth, which we already have in place and working well. I would normally consider SharePoint a viable option for this, but we're exclusively Mac and Linux, and I won't put up a Windows server just for this.

    Read the article

  • Podcast online sync service

    - by Hannes de Jager
    I download and listen to podcasts on two different computers. I would like to somehow sync the metadata so that when I subscribe to a feed, it gets subscribed on my other computer as well, and when I've downloaded a podcast, it gets marked as downloaded on the other machine too. Is there a software + web service combination that will let me do this, saving its state to the cloud and syncing from there? One of my machines runs Windows and the other Linux.

    Read the article

  • Sharing a DVB card on Windows 7 [closed]

    - by Bashar Kernel
    I have two computers connected to a router, and I have a DVB card in one of them. I want to use that one DVB card to feed both of them. From what I've read, I gather I need to share the DVB adapter with Internet Connection Sharing on the LAN, but when I use connection sharing I lose my internet access. I tried using a bridged connection, but then I lost my internet access too. Can anyone tell me how to fix this problem, and how to view the channels (for example, how to use VLC)?
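
    An alternative worth considering alongside connection sharing is streaming the tuner's output over the LAN with VLC instead of sharing the adapter itself; a heavily hedged sketch (the dvb-t:// tuning parameters, the port and the 192.168.1.10 address are assumptions, and the exact MRL syntax depends on the VLC version and card):

        # On the PC with the DVB card: tune a multiplex and re-serve the
        # transport stream over HTTP to the LAN.
        vlc dvb-t://frequency=506000000 \
            --sout '#standard{access=http,mux=ts,dst=:8080}'

        # On the second computer: just open the stream.
        vlc http://192.168.1.10:8080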

    Read the article
