Search Results

Search found 6580 results on 264 pages for 'require'.


  • Color Calibrate Dual Monitor XP SP2

    - by Laramie
    This topic has been touched on before but not really answered. I have a dual monitor system and the colors differ wildly. I currently live in Buenos Aires, where color correction hardware costs premium prices. I do some graphic design, but don't require a pro-level calibration. That said, I'd like my monitors to be set as close to "true color" as possible. I've located the useful and free Monitor Calibration Wizard, but it seems to adjust the entire system internally at startup. I could use the Microsoft Color Control Panel Applet to set a different ICC or ICM profile for each monitor, but the Monitor Calibration Wizard outputs its own format for profiles.


  • R: extracting "clean" UTF-8 text from a web page scraped with RCurl

    - by SlowLearner
    Using R, I am trying to scrape a web page and save the text, which is in Japanese, to a file. Ultimately this needs to be scaled to tackle hundreds of pages on a daily basis. I already have a workable solution in Perl, but I am trying to migrate the script to R to reduce the cognitive load of switching between multiple languages. So far I am not succeeding. Related questions seem to be this one on saving csv files and this one on writing Hebrew to a HTML file. However, I haven't been successful in cobbling together a solution based on the answers there. The pages are from Yahoo! Japan Finance, and my Perl code looks like this:

        use strict;
        use HTML::Tree;
        use LWP::Simple;
        #use Encode;
        use utf8;
        binmode STDOUT, ":utf8";
        my @arr_links = ();
        $arr_links[1] = "http://stocks.finance.yahoo.co.jp/stocks/detail/?code=7203";
        $arr_links[2] = "http://stocks.finance.yahoo.co.jp/stocks/detail/?code=7201";
        foreach my $link (@arr_links){
            $link =~ s/"//gi;
            print("$link\n");
            my $content = get($link);
            my $tree = HTML::Tree->new();
            $tree->parse($content);
            my $bar = $tree->as_text;
            open OUTFILE, ">>:utf8", join("","c:/", substr($link, -4),"_perl.txt") || die;
            print OUTFILE $bar;
        }

    This Perl script produces a CSV file that looks like the screenshot below, with proper kanji and kana that can be mined and manipulated offline. My R code, such as it is, looks like the following. The R script is not an exact duplicate of the Perl solution just given, as it doesn't strip out the HTML and leave only the text (this answer suggests an approach using R, but it doesn't work for me in this case) and it doesn't have the loop and so on, but the intent is the same:

        require(RCurl)
        require(XML)
        links <- list()
        links[1] <- "http://stocks.finance.yahoo.co.jp/stocks/detail/?code=7203"
        links[2] <- "http://stocks.finance.yahoo.co.jp/stocks/detail/?code=7201"
        txt <- getURL(links, .encoding = "UTF-8")
        Encoding(txt) <- "bytes"
        write.table(txt, "c:/geturl_r.txt",
                    quote = FALSE, row.names = FALSE,
                    sep = "\t", fileEncoding = "UTF-8")

    This R script generates the output shown in the screenshot below. Basically rubbish. I assume that there is some combination of HTML, text and file encoding that will allow me to generate in R a similar result to that of the Perl solution, but I cannot find it. The header of the HTML page I'm trying to scrape says the charset is utf-8, and I have set the encoding in the getURL call and in the write.table function to utf-8, but this alone isn't enough. The question: how can I scrape the above web page using R and save the text as CSV in "well-formed" Japanese text rather than something that looks like line noise? Edit: I have added a further screenshot to show what happens when I omit the Encoding step. I get what look like Unicode code points, but not the graphical representation of the characters. So it may be some kind of locale-related issue, but in the exact same locale the Perl script does provide useful output. So this is still puzzling.
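
    A minimal sketch of one possible approach, using the XML package the script already loads -- on the assumption that parsing the fetched page with an explicit encoding and extracting only the text nodes is the R analogue of HTML::Tree->as_text (the XPath and output path are illustrative):

        require(RCurl)
        require(XML)

        url <- "http://stocks.finance.yahoo.co.jp/stocks/detail/?code=7203"
        txt <- getURL(url, .encoding = "UTF-8")

        # Parse the HTML in memory, keeping UTF-8 intact
        doc   <- htmlParse(txt, asText = TRUE, encoding = "UTF-8")
        # Keep only the text nodes, i.e. strip all markup
        plain <- paste(xpathSApply(doc, "//text()", xmlValue), collapse = "\n")

        # Write through a connection opened with an explicit encoding
        con <- file("c:/geturl_r.txt", open = "w", encoding = "UTF-8")
        writeLines(plain, con)
        close(con)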


  • Looking at desktop virtualization, but some users need 3D support. Is HP Remote Graphics a viable solution?

    - by Ryan Thompson
    My company is looking at desktop virtualization and planning to move all of the desktop compute resources into the server room or data center, providing users with thin clients for access. In most cases, a simple VNC or Remote Desktop solution is adequate, but some users are running visualizations that require 3D capability -- something that VNC and Remote Desktop cannot support. Rather than making an exception and providing desktop machines for these users, complicating our rollout and future operations, we are considering adding servers with GPUs and using HP's Remote Graphics to provide access from the thin client. The demo version appears to work acceptably, but there is a bit of a learning curve, it's not clear how well it would work for multiple simultaneous sessions, and it's not clear whether it would be a good solution for non-3D sessions. If possible, as with the hardware, we want to deploy a single software solution instead of a mishmash. If anyone has had experience managing a large installation of HP Remote Graphics, I would appreciate any feedback you can provide.


  • Web-based source control management software [closed]

    - by tom smith
    Hi. Not sure if this is the right place, but hopefully someone might have thoughts on a solution/vendor. I'm starting to spec out a project that will require multiple (50-100) developers to be able to manipulate source files/scripts for a large-scale project. The idea is to have each app go through a dev/review/test process, where the users can select (or be assigned) the role they're going to have for a given app. I'm looking for web-based version control, issue tracking, user roles/access, workflow functionality, etc. Ideally, the process will also allow a reviewed/validated app to be exported to a separate system for testing on the test server/environment. This can be hosted on our servers, or we can colocate. I've checked out Atlassian/CollabNet, but any thoughts you can provide would be appreciated as well. Thanks.


  • Apache SSL for login and NON-SSL for everything else (.htaccess)

    - by The Devil
    Hey, I've almost figured it out on my own, but there's something I'm missing. I want to set a couple of directories and files to require SSL, and everything else that's not related to those files and dirs to point back to http. So far I have this:

        RewriteEngine on
        RewriteBase /

        # Force ssl for login & admin
        RewriteCond %{HTTPS} !on
        RewriteRule ^/?(admin(.*)|login\.php)$ https://%{SERVER_NAME}/$1 [R,NC,L]

        # Force non-ssl for others
        RewriteCond %{HTTPS} on
        RewriteRule ^/?(admin(.*)|login\.php)$ http://%{SERVER_NAME}/$1 [R,NC,L]

    I'm sure I'm doing something wrong, but I just can't figure it out. The first condition works perfectly: whenever I access login.php or /admin/ it points to https. But the second one doesn't. Where have I gone wrong? Thanks in advance!
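
    One reading of the problem, offered as a sketch rather than a confirmed fix: the second rule reuses the same URL pattern as the first, so it can only ever match the admin/login paths it is supposed to exclude. Inverting the match with an extra RewriteCond might look like this (untested):

        # Force non-ssl for everything that is NOT admin or login.php
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(admin|login\.php)
        RewriteRule ^(.*)$ http://%{SERVER_NAME}/$1 [R,NC,L]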


  • Adjusting color on one monitor in Nvidia Surround

    - by Chris Stauffer
    I'm currently running three 2560x1440 screens using Nvidia Surround. Two monitors are Yamakasi Catleaps (cheap Korean jobs) and the third is an Achievia Shimian (also Korean). The Catleaps have great color reproduction; the Shimian, however, has a pronounced blue tint. On a normal monitor the required correction would take minimal effort, but these Korean monitors have no hardware controls for it. For those who are unfamiliar, Nvidia Surround basically takes all three monitors and makes one big "monitor" out of them (Xinerama for GNU/Linux folk), at a resolution of 7680x1440 in my case. Therefore, adjusting the color profile in the Nvidia control panel changes the settings for ALL of the monitors simultaneously. Thus, I am looking for some software to adjust only the Shimian (perhaps by just selecting the pixels that monitor encompasses). Does anyone know of such a program?


  • Getting iTunes to play third party AAC files

    - by Redmastif
    I have a library filled with some old MP3 files, and I'm in the process of changing them all to AAC for the better sound quality. Obviously I can't just create AAC versions of the files I already have, because they would sound worse (lossy compression converted to more lossy compression), so I'm going to their source, downloading them in a lossless form, and using a third-party tool to make them into AAC. Apparently iTunes will not handle AAC files that aren't made with iTunes. Is there a way around this? I've looked at third-party programs and would be willing to use them, but since they all require the iTunes/iPod/iEverything driver, I don't know whether my files would still be rejected or not. Also, before you jump on my back about piracy: these files are from old CDs that I lost years ago. I paid for them. Thanks.


  • What ports to open for mail server?

    - by radman
    Hi, I have just finished setting up a Postfix mail server on a Linux (Ubuntu) platform. I have it sending and receiving email, and it is not an open relay. It also supports secure SMTP and IMAP. Now, this is a pretty beginner question, but should I be leaving port 25 open, given that secure SMTP is preferred? If so, why? Also, what about port 587? And should I require any authentication on either of these ports? Please excuse my ignorance in this area :P
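
    For context, a sketch of the common arrangement (not from the original question): port 25 stays open for server-to-server delivery, while clients submit on 587 with TLS and authentication enforced. In Postfix that is typically a master.cf entry along these lines, though the exact options depend on the SASL setup:

        # /etc/postfix/master.cf -- submission service on port 587 (illustrative)
        submission inet n       -       n       -       -       smtpd
          -o smtpd_tls_security_level=encrypt
          -o smtpd_sasl_auth_enable=yes
          -o smtpd_recipient_restrictions=permit_sasl_authenticated,reject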


  • How to split audio into multiple channels from optical S/PDIF or 1/8"?

    - by Josh M.
    I have a motherboard with an optical S/PDIF output, or 1/8". I'd like to "split" that signal into the appropriate channels so that I can then connect it to the wires behind my car's head unit which, in turn, run to the amp. The factory Bose amp takes a single connector with a million wires running out of it, which is why I need to separate the signal into individual channels. On the other end there are four RCA connectors: front left, front right, rear left, rear right. The subwoofer signal does not require an additional connection. Edit: Revised to include S/PDIF or 1/8".


  • How does AuthzSVNAccessFile work?

    - by grigy
    I have set up an SVN repo with WebDAV access. For some reason it does not let me check out. Here is the relevant part of my httpd.conf:

        <Location /svn>
          DAV svn
          SVNParentPath /home/svn/repositories
          AuthzSVNAccessFile /home/svn/dav_svn.authz
          Satisfy Any
          Require valid-user
          AuthType Basic
          AuthName "Subversion Repository"
          AuthUserFile /home/svn/dav_svn.passwd
        </Location>

    I have two repositories named "first" and "second", and the content of dav_svn.authz is:

        [first:/]
        doe = rw
        * = r

        [second:/]
        doe = rw
        grig = rw
        * = r

    When I try to check out the second repository as user doe, I get this in error_log:

        user doe: authentication failure for "/svn/second": Password Mismatch

    In order to understand what the problem might be, I would like to better understand how AuthzSVNAccessFile is supposed to work.
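
    A note on where the error points, offered as an assumption rather than a diagnosis: "Password Mismatch" is raised by the Basic authentication check against AuthUserFile, before the authz rules are consulted at all, so the entry for doe in dav_svn.passwd may simply be stale. Resetting it with htpasswd would rule that out:

        # (Re)set the password for user doe in the DAV auth file
        htpasswd /home/svn/dav_svn.passwd doe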


  • Use argdo with search pattern to delete line while suppressing errors and requiring confirmation in Vim

    - by richardh
    I use gVim 7.3.46 on Windows 7. It is pretty straightforward to use argdo to search the args files for a pattern and replace it while suppressing errors and requiring confirmation:

        :argdo %s/pattern/replace/gec | update

    However, I would like to delete entire lines that contain the pattern. I use the following:

        :argdo %/pattern/d | update

    But I can't suppress errors or require confirmation. Is there a way to do this? Thanks! (Also, is there a way to turn 'more' off? Thanks!)
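
    A speculative sketch, not from the original post: since :s can delete a line when the replacement consumes its trailing newline, the same flags as in the first command should carry over, with e suppressing "pattern not found" errors and c prompting for each deletion; the more-prompt is controlled by the 'more' option:

        :argdo %s/.*pattern.*\n//gec | update
        :set nomore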


  • Another "Call to a member function get_segment() on a non-object" question

    - by hogofwar
    I get the above error when calling this code:

        <?
        class Test1 extends Core {
            function home(){
                ?> This is the INDEX of test1 <?
            }
            function test2(){
                echo $this->uri->get_segment(1); // this is where the error comes from
                ?> This is the test2 of test1 testing URI <?
            }
        }
        ?>

    I get the error where commented. This class extends this class:

        <?php
        class Core {
            public function start() {
                require("funk/funks/libraries/uri.php");
                $this->uri = new uri();
                require("funk/core/loader.php");
                $this->load = new loader();
                if($this->uri->get_segment(1) != "" and file_exists("funk/pages/".$this->uri->get_segment(1).".php")){
                    include("funk/pages/".$this->uri->get_segment(1).".php");
                    $var = $this->uri->get_segment(2);
                    if ($var != ""){
                        $home = $this->uri->get_segment(1);
                        $Index = new $home();
                        $Index->$var();
                    }else{
                        $home = $this->uri->get_segment(1);
                        $Index = new $home();
                        $Index->home();
                    }
                }elseif($this->uri->get_segment(1) and ! file_exists("funk/pages/".$this->uri->get_segment(1).".php")){
                    echo "404 Error";
                }else{
                    include("funk/pages/index.php");
                    $Index = new Index();
                    $Index->home();
                    //$this->Index->index();
                    echo "<!--This page was created with FunkyPHP!-->";
                }
            }
        }
        ?>

    And here is the contents of uri.php:

        <?php
        class uri {
            private $server_path_info = '';
            private $segment = array();
            private $segments = 0;

            public function __construct() {
                $segment_temp = array();
                $this->server_path_info = preg_replace("/\?/", "", $_SERVER["PATH_INFO"]);
                $segment_temp = explode("/", $this->server_path_info);
                foreach ($segment_temp as $key => $seg) {
                    if (!preg_match("/([a-zA-Z0-9\.\_\-]+)/", $seg) || empty($seg))
                        unset($segment_temp[$key]);
                }
                foreach ($segment_temp as $k => $value) {
                    $this->segment[] = $value;
                }
                unset($segment_temp);
                $this->segments = count($this->segment);
            }

            public function segment_exists($id = 0) {
                $id = (int)$id;
                if (isset($this->segment[$id])) return true;
                else return false;
            }

            public function get_segment($id = 0) {
                $id--;
                $id = (int)$id;
                if ($this->segment_exists($id) === true) return $this->segment[$id];
                else return false;
            }
        }
        ?>

    I have asked a similar question to this before, but the answer does not apply here. I have rewritten my code three times trying to kill this godforsaken error, but it keeps coming back.
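
    A guess at the cause, sketched rather than asserted: Core::start() assigns $this->uri on the instance it runs on, but new $home() builds a fresh Test1 object whose own $uri property was never set, so $this->uri inside test2() is not an object. One way around that -- a self-contained, simplified illustration, not the original framework code -- is to hand the existing uri instance to the page object when it is constructed:

        <?php
        // Hypothetical sketch: inject the uri object into the page class
        // instead of relying on state set on a different instance.
        class Uri {
            private $segments;
            public function __construct(array $segments) { $this->segments = $segments; }
            public function get_segment($id) {
                return isset($this->segments[$id - 1]) ? $this->segments[$id - 1] : false;
            }
        }

        class Test1 {
            private $uri;
            public function __construct(Uri $uri) { $this->uri = $uri; }
            public function test2() { echo $this->uri->get_segment(1); }
        }

        $page = new Test1(new Uri(array("test1", "test2")));
        $page->test2();  // prints "test1" -- $uri is guaranteed to be set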


  • Any problems with using a 301 redirect to force https traffic in IIS?

    - by Jess
    Is there any problem with using a 301 redirect to force all traffic to go to a secure-only site? We originally had redirect rules, but enforcing SSL-only seemed more secure. Here is how we set it up:

        Site 1: https://example.com/
            Require SSL set
            Bound to 443 only

        Site 2: http://example.com
            Bound to 80 only
            Empty folder - no actual html or other data
            301 redirects to https://example.com

    This seems to work beautifully, but are there any issues with doing this? Would any browsers not recognize the 301 redirect, or could there be security warnings during the redirect?
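
    For comparison only, not a fix: if the URL Rewrite module is installed, the same policy can live in a single site's web.config instead of a second empty site. A sketch, assuming URL Rewrite 2.x:

        <rewrite>
          <rules>
            <rule name="HTTP to HTTPS" stopProcessing="true">
              <match url="(.*)" />
              <conditions>
                <add input="{HTTPS}" pattern="off" />
              </conditions>
              <action type="Redirect" url="https://{HTTP_HOST}/{R:1}"
                      redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>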


  • FCoE, on any Ethernet switch?

    - by javano
    I understand the concept of FCoE. I have looked at the Wikipedia page, and judging by the layer 2 frame diagram, it looks like FCoE really should "just work" on any Ethernet switch -- but is this really the case? If so, what do switches like Cisco's Nexus 5k or 6120P offer that normal switches don't (in specific relation to FCoE)? I am just using those two switches as examples. On the Nexus 5548UP page, for example, it says the following: "Unified ports that support traditional Ethernet, Fibre Channel (FC), and Fibre Channel over Ethernet (FCoE)". Well, if FCoE runs over regular Ethernet, then why does it list support for "Ethernet and Fibre Channel over Ethernet" separately? This is why I am curious as to whether FCoE will run on any Ethernet switch and these switches just offer "bonus" features, or whether you do indeed require a specialist switch. Thank you.


  • Using nginx and/or varnish to cache server-generated 301 redirects

    - by rlotun
    I'm implementing a sort of URL-shortener service. What happens is that I have a backend app server that takes in a request, does some computation and returns a 301-redirect URL back upstream to an nginx frontend:

        request ---> nginx ---> app_server

    What I want to be able to do is cache this returned 301 URL for the same request (a specific URL with a "short code"). Does nginx do this caching automatically? Or should I drop in something like Varnish between nginx and the app_server? I could easily cache this in memcache, but that would require hitting the app_server, which I'm sure can be dispensed with after the first request. Thanks.
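
    nginx's proxy cache can store redirect responses, so a minimal sketch of the nginx-only route might look like this (the upstream name, cache zone and durations are placeholders):

        # Illustrative proxy-cache config for caching the 301s
        proxy_cache_path /var/cache/nginx keys_zone=shortener:10m;

        server {
            location / {
                proxy_pass http://app_server;
                proxy_cache shortener;
                proxy_cache_valid 301 1d;   # keep redirect responses for a day
            }
        }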


  • WSS "Cannot connect to the configuration database"

    - by Tim
    I have 64-bit WSS 3.0 installed on a 64-bit Windows 2003 Server. After installing WSS 3.0 I switched IIS to run in 32-bit emulation mode, as we have some applications that require this. I'm getting a "Cannot connect to the configuration database" error trying to get to the Central Admin page, and wondered:

        a. whether the setup I have won't work and I'm wasting my time trying to figure this out, or
        b. whether anyone has any suggestions for resolving the database connection issue?

    The identity of the app pool that WSS runs under has all the required permissions in SQL, so far as I can tell. Any help would be appreciated!


  • USB 3 adapter for a Dell 2850 with PCI (or PCI-X) ports

    - by Don Dickinson
    Does anyone know if there is a plain PCI (or PCI-X) USB 3 adapter? I understand that the bandwidth of PCI is less than that of USB 3, but it still beats the heck out of USB 2. I have some older Dell 2850s that do not have the PCIe slots that most USB 3 adapters require, and I'd really like to get USB 3 into those servers. I searched the Internet but didn't see any, and the local computer store said they only had PCIe adapters. Thanks in advance, Don


  • Web Server Users - Best Practice

    - by Toby
    I was wondering what is considered best practice when several developers/administrators require access to the same web server. Should there be one non-root user, with a secure username and password unique to the web server, which everyone logs in as, or should there be a username for each person? I am leaning towards a username for each person, to aid in logging etc. But then, does the same user keep the same credentials over several servers, or should at least their password change depending on the server they are on? Should any non-root user of the system be added to the sudoers file, or is it best practice to leave everyone off it and only let root perform certain tasks? Any help would be greatly appreciated.



  • Multiple Selections from Dropdown in Word 2011 Form

    - by BR Hudgens
    Does anyone know if there is a way to set up a Word form that lets the user select multiple items from a dropdown list, which are then displayed? For example, I need to set up a form to identify the applicability of specific procedures to various levels of our organization. Since a listing of all departments under all divisions can be pretty lengthy, I was hoping for the ability to check off a division and then have a dropdown list on the same line for selecting the applicable departments (so the selections are specific to that division, and divisions that don't apply aren't all listed on the form). First, I'm not seeing how to create a dropdown that allows multiple selections. Beyond that, I don't know whether that form element alone would display all selected items or whether that would require another element of some type. Does anyone have any suggestions? Thanks in advance!


  • What is the easiest way to do a direct file transfer of an extremely large file over the Internet?

    - by Kenneth Cochran
    I would like to transfer a 20+ GB file to a friend. I would like the transfer to:

        1. Be fast
        2. Ensure data integrity
        3. Not require opening ports in either end's firewall
        4. Be free
        5. Not broadcast the file's existence to everyone on the Internet

    I've looked at several technologies and nothing seems to fit:

        Gnutella, BitTorrent, et al. -- satisfies 1, 2 and 4
        JetBytes -- 1, 3, 4 and 5
        Yahoo Messenger, AIM, etc. -- 3, 4 and 5
        FTP, SFTP -- 1?, 4 and 5
        rsync -- 1, 2, 4 and 5

    For a file this size, speed and data integrity are the most important. No one wants a 20 GB file to fail an MD5 check after spending two days downloading it. Is there anything that meets all these requirements?


  • Why don't I just build the whole web app in Javascript and Javascript HTML Templates?

    - by viatropos
    I'm getting to the point on an app where I need to start caching things, and it got me thinking... In some parts of the app, I render table rows (jqGrid, SlickGrid, etc.) or fancy div rows (like in the new Twitter) by grabbing pure JSON and running it through something like Mustache, jquery.tmpl, etc. In other parts of the app, I just render the info in pure HTML (server-side HAML templates), and if there's searching/paginating, I just go to a new URL and load a new HTML page.

    Now the problem is in caching and maintainability. On one hand I'm thinking: if everything were built using JavaScript HTML templates, then my app would serve just an HTML layout/shell and a bunch of JSON. If you look at the Facebook and Twitter HTML source, that's basically what they're doing (95% JSON/JavaScript, 5% HTML). This would mean my app only needed to cache JSON (pages, actions, and/or records), so you'd hit the cache whether you were a remote API developer accessing a JSON API or the web app itself. That is, I wouldn't need two caches, one for the JSON and one for the HTML. That seems like it would cut my cache store down by half and streamline things a bit.

    On the other hand I'm thinking: from what I've seen and experienced, generating static HTML server-side and caching that seems to perform much better cross-browser; you get the graphics instantly and don't have to wait that split second for JavaScript to render them. Stack Overflow seems to do everything in plain HTML, and you can tell -- everything appears at once. Notice how on twitter.com the page is blank for 0.5-1 seconds while the page chunks in: the JavaScript has to render the JSON. The downside of this is that for anything dynamic (like endless scrolling, or grids) I'd have to create JavaScript templates anyway, so now I'd have server-side HAML templates, client-side JavaScript templates, and a lot more to cache.

    My question is: is there any consensus on how to approach this? What are the benefits and drawbacks, from your experience, of mixing the two versus going 100% with one over the other?

    Update: Some reasons that factor into why I haven't yet decided to go with 100% JavaScript templating:

        1. Performance. I haven't formally tested, but from what I've seen, raw HTML renders faster and more fluidly than JavaScript-generated HTML cross-browser. Plus, I'm not sure how mobile devices handle dynamic HTML performance-wise.
        2. Testing. I have a lot of integration tests that work well with static HTML, so switching to JavaScript-only would require 1) more focused pure-JavaScript testing (Jasmine), and 2) integrating JavaScript into Capybara integration tests. This is just a matter of time and work, but it's probably significant.
        3. Maintenance. Getting rid of HAML. I love HAML: it's easy to write, it prints pretty HTML, it makes code clean and maintenance easy. Going with JavaScript, there's nothing as concise.
        4. SEO. I know Google handles the AJAX /#!/path scheme, but I haven't grasped how it will affect other search engines or how older browsers handle it. It seems like it would require a significant setup.
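
    To make the JSON-plus-template half of the comparison concrete, here is a minimal sketch of the client-side rendering path described above, using Mustache.js (the template, element IDs and data shape are invented for illustration):

        // Render grid rows from raw JSON with Mustache.js
        var template = "<tr><td>{{name}}</td><td>{{followers}}</td></tr>";

        var rows = [
            { name: "alice", followers: 120 },
            { name: "bob",   followers: 45 }
        ];

        var html = rows.map(function (row) {
            return Mustache.render(template, row);
        }).join("");

        // The server only ever shipped the layout shell plus this JSON
        document.querySelector("#grid tbody").innerHTML = html;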


  • Permanently override background colour of Emacs theme

    - by John J. Camilleri
    I want to use the Emacs theme billw, except with a different background colour. I have the following in my .emacs file:

        (require 'color-theme)
        (color-theme-initialize)
        (color-theme-billw)
        (set-background-color "gray12")

    However, this doesn't seem to change the background colour on startup; I need to manually run set-background-color "gray12" in the minibuffer at the beginning of each session. Any help with this? I tried creating my own custom theme based on the output of color-theme-print, but this caused more problems than it's worth...
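
    One possibility, offered as an assumption about ordering rather than a verified fix: set-background-color only affects frames that already exist when it runs, so in .emacs it may fire before the initial frame is ready. Recording the colour in default-frame-alist and deferring the call might help:

        ;; Speculative sketch: apply the override to future frames and
        ;; re-apply it once startup has finished building the first frame.
        (add-to-list 'default-frame-alist '(background-color . "gray12"))
        (add-hook 'window-setup-hook
                  (lambda () (set-background-color "gray12")))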


  • Ways to remotely reboot a Linux system

    - by dualed
    I had a remote server running Debian Sarge that experienced some HDD failure, and I meant to reboot it hoping that fsck could repair the errors automatically. I eventually drove out there and replaced the faulty disks... But I was wondering: what other ways are there to force a Linux system to reboot that do not require hard drive access? What I had tried:

        shutdown -r now
            Did not work, as shutdown is a program that would have to be
            loaded from disk; the error shown in the terminal was:
            bash: /sbin/shutdown: Input/output error
        init 6
            Same as above.
        telinit q
            Same as above.
        kill -2 1
            This did not print an error, but did not work either. (However,
            it is possible that the Sarge init did not implement SIGINT; the
            Sarge manpages did not mention it. So it could work in a more
            recent version of Debian.)
        This guide on PCFreak.net
            This failed at sysctl, which was not in memory either.
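
    For future reference, a sketch of one mechanism that avoids userland binaries entirely, assuming the shell is still alive and /proc is usable: the kernel's magic SysRq trigger, which lives in the already-loaded kernel rather than on disk (echo is a shell builtin, so nothing is read from the failed drive):

        echo 1 > /proc/sys/kernel/sysrq      # enable SysRq if it isn't already
        echo s > /proc/sysrq-trigger         # attempt an emergency sync first
        echo b > /proc/sysrq-trigger         # immediate reboot, no clean shutdown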


  • How do I make a privileged port non-privileged in Redhat 5?

    - by Jason Thompson
    So I have a RedHat 5 box on which I want to run an application I wrote that implements SLP. SLP uses port 427 for answering service queries. My understanding is that ports below 1024 are "privileged" and thus cannot be bound to by anyone who is not root. I cannot run this application as root, as it is launched via Tomcat. One creative solution I really like is simply writing an iptables rule to route the privileged port to a non-privileged one. In my proof-of-concept tests, this works wonderfully. Unfortunately, it would be greatly (and understandably) desired by the powers that be if my application did not require screwing around with iptables upon installation. So I heard a rumor, which I cannot find anything to verify, that there is some sort of command or parameter that can make any port I want non-privileged. Is this true? If so, how is it done? Thanks!
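
    For reference, a sketch of the iptables redirect described above (the unprivileged target port 4270 is arbitrary; SLP uses both TCP and UDP on 427, so both are redirected):

        iptables -t nat -A PREROUTING -p tcp --dport 427 -j REDIRECT --to-ports 4270
        iptables -t nat -A PREROUTING -p udp --dport 427 -j REDIRECT --to-ports 4270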

