Search Results

Search found 29159 results on 1167 pages for 'xml configuration'.

  • How do I get started with Chef?

    - by Brad Wright
    The Chef documentation is pretty bad, and Google isn't helping me. Can anyone point me at a decent article or something that would help me get started? My specific issues are: How do I get a client to read my configuration? (chef-solo seems like the best start, since I don't want to run an OpenID server or Merb.) And how do I configure Apache to serve Django? I already know how to do this via regular server configuration, but I figure an example Chef recipe would be a good start.
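    For what it's worth, a minimal chef-solo layout looks roughly like the sketch below. Every file name, the cookbook name and the vhost template are placeholders I made up for illustration, not anything from the official Chef docs:

      solo.rb (tells chef-solo where the cookbooks live):
        cookbook_path "/var/chef/cookbooks"

      node.json (the run list chef-solo should apply):
        { "run_list": ["recipe[django_site]"] }

      cookbooks/django_site/recipes/default.rb:
        package "apache2"
        package "libapache2-mod-wsgi"

        template "/etc/apache2/sites-available/django_site" do
          source "vhost.erb"                      # an ERB vhost template you ship in the cookbook
          notifies :restart, "service[apache2]"
        end

        service "apache2" do
          action [:enable, :start]
        end

    chef-solo is then run as: chef-solo -c solo.rb -j node.json. Enabling the vhost and mod_wsgi is left out of the sketch.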

    Read the article

  • Tutorial for Quick Look Generator for Mac

    - by vgm64
    I've checked out Apple's Quick Look Programming Guide: Introduction to Quick Look page in the Mac Dev Center, but as more of a science programmer than an Apple programmer, it is a little over my head (though I could get through it in a weekend if I bash my head against it long enough). Does anyone know of a good basic Quick Look generator tutorial that is simple enough for someone with only very modest experience with Xcode? For those that are curious, I have a filetype called .evt that has an XML header and then binary info after the header. I'm trying to write a generator to display the XML header. There's no application bundle that it belongs to. Thanks!

    Read the article

  • Problem after mysql-server installation: I can't install anything in Ubuntu 12.04.1 now

    - by mohammed ezzi
    I'm not an advanced Linux user. I tried to work with a database, so I installed mysql-server; I think I did something wrong, because now I'm in trouble and can't install anything. This is what I get when I use apt-get -f install:

      root@me:~# apt-get -f install
      Reading package lists... Done
      Building dependency tree
      Reading state information... Done
      Correcting dependencies... Done
      The following extra packages will be installed:
        mysql-server mysql-server-5.5
      Suggested packages:
        tinyca mailx
      The following packages will be upgraded:
        mysql-server mysql-server-5.5
      2 upgraded, 0 newly installed, 0 to remove and 194 not upgraded.
      2 not fully installed or removed.
      Need to get 0 B/8,737 kB of archives.
      After this operation, 15.4 kB of additional disk space will be used.
      Do you want to continue [Y/n]? y
      dpkg: dependency problems prevent configuration of mysql-server-5.5:
       mysql-server-5.5 depends on mysql-server-core-5.5 (= 5.5.24-0ubuntu0.12.04.1); however:
        Version of mysql-server-core-5.5 on system is 5.5.28-0ubuntu0.12.04.3.
      dpkg: error processing mysql-server-5.5 (--configure):
       dependency problems - leaving unconfigured
      No apport report written because MaxReports is reached already
      dpkg: dependency problems prevent configuration of mysql-server:
       mysql-server depends on mysql-server-5.5; however:
        Package mysql-server-5.5 is not configured yet.
      dpkg: error processing mysql-server (--configure):
       dependency problems - leaving unconfigured
      No apport report written because MaxReports is reached already
      Errors were encountered while processing:
       mysql-server-5.5
       mysql-server
      E: Sub-process /usr/bin/dpkg returned an error code (1)

    I tried to remove mysql-server but nothing happened.
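    The version numbers in that output suggest a partial upgrade (mysql-server-core-5.5 is already at 5.5.28 while mysql-server-5.5 still expects 5.5.24), so a sequence along these lines is worth trying before anything more drastic; it only uses stock apt/dpkg commands:

      # refresh the package lists, then pull mysql-server-5.5 up to the same version as the core package
      sudo apt-get update
      sudo apt-get install mysql-server-5.5
      # let dpkg finish configuring anything left half-installed, then re-run the dependency fixer
      sudo dpkg --configure -a
      sudo apt-get -f install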

    Read the article

  • Windows expand over 2 monitors in quad-monitor setup

    - by Martin
    i just installed ubuntu 11.10 with my previous hardware setup: 4 monitors and 2 identical nvidia graphic cards. draging windows around all 4 monitors works nice, but when i maximize a window it expand always over 2 screens. (2x twinview). i had an workaround for this in 11.04 but cant remember what it was... may one of you guys have quad monitors up and running with window maximizing on only one screen my xorg.conf looks like this: # nvidia-settings: X configuration file generated by nvidia-settings # nvidia-settings: version 280.13 (buildd@allspice) Thu Aug 11 20:54:45 UTC 2011 # nvidia-xconfig: X configuration file generated by nvidia-xconfig # nvidia-xconfig: version 280.13 ([email protected]) Wed Jul 27 17:15:58 PDT 2011 Section "ServerLayout" # Removed Option "Xinerama" "1" # Removed Option "Xinerama" "0" Identifier "Layout0" Screen 0 "Screen0" 0 0 Screen 1 "Screen1" 0 1080 InputDevice "Keyboard0" "CoreKeyboard" InputDevice "Mouse0" "CorePointer" Option "Xinerama" "1" EndSection Section "Files" EndSection Section "InputDevice" # generated from default Identifier "Mouse0" Driver "mouse" Option "Protocol" "auto" Option "Device" "/dev/psaux" Option "Emulate3Buttons" "no" Option "ZAxisMapping" "4 5" EndSection Section "InputDevice" # generated from default Identifier "Keyboard0" Driver "kbd" EndSection Section "Monitor" Identifier "Monitor0" VendorName "Unknown" ModelName "Samsung SMB2220N" HorizSync 31.0 - 80.0 VertRefresh 56.0 - 75.0 Option "DPMS" EndSection Section "Monitor" Identifier "Monitor1" VendorName "Unknown" ModelName "Samsung SMB2220N" HorizSync 31.0 - 80.0 VertRefresh 56.0 - 75.0 Option "DPMS" EndSection Section "Device" Identifier "Device0" Driver "nvidia" VendorName "NVIDIA Corporation" BoardName "GeForce GTX 550 Ti" BusID "PCI:2:0:0" EndSection Section "Device" Identifier "Device1" Driver "nvidia" VendorName "NVIDIA Corporation" BoardName "GeForce GTX 550 Ti" BusID "PCI:3:0:0" EndSection Section "Screen" # Removed Option "TwinView" "True" # Removed Option "MetaModes" "nvidia-auto-select, nvidia-auto-select" # Removed Option "metamodes" "CRT-0: nvidia-auto-select 1920x1080 +0+0, CRT-1: nvidia-auto-select 1920x1080 +1920+0" Identifier "Screen0" Device "Device0" Monitor "Monitor0" DefaultDepth 24 Option "TwinView" "1" Option "TwinViewXineramaInfoOrder" "CRT-0" Option "metamodes" "CRT-0: nvidia-auto-select +0+0, CRT-1: nvidia-auto-select +1920+0" SubSection "Display" Depth 24 EndSubSection EndSection Section "Screen" # Removed Option "TwinView" "True" # Removed Option "MetaModes" "nvidia-auto-select, nvidia-auto-select" # Removed Option "metamodes" "CRT-0: nvidia-auto-select 1920x1080 +0+0, CRT-1: nvidia-auto-select 1920x1080 +1920+0" Identifier "Screen1" Device "Device1" Monitor "Monitor1" DefaultDepth 24 Option "TwinView" "1" Option "TwinViewXineramaInfoOrder" "CRT-0" Option "metamodes" "CRT-0: nvidia-auto-select +0+0, CRT-1: nvidia-auto-select +1920+0" SubSection "Display" Depth 24 EndSubSection EndSection Section "Extensions" Option "Composite" "Disable" EndSection

    Read the article

  • How to start a task before networking?

    - by user1252434
    I've written an upstart task that modifies /etc/network/interfaces. (Actually a file sourced into it.) Which start on condition do I need to declare to let my task run before any networking jobs? I've tried start on starting networking, but that's apparently too late. When I log in after booting I can see that the changes were written, but obviously they are not used: the new config states a static IP, but the boot process waits for a non-existent DHCP server (old config) to time out. I've also tried start on starting network-interface INTERFACE=eth0, which didn't work either. IIRC there was an error in the log that the change couldn't be written. Background: I need a VM template that can be cloned and the clones configured through a script. Among other settings, I need to give them a static IP address to access them from the host. I use guestfish to write a config file to one of the virtual disks and let a script apply these settings to the system. I don't want that disk to contain an actual system settings file. I can't modify /etc directly, because that disk is shared (copy-on-write/diff) among the clones and guestfish apparently doesn't support that type of image. I could also let them use DHCP and set up a server that assigns IPs by MAC, but I'm afraid of the complexity. I could also add just another virtual disk for configuration files, but if possible I'd prefer to store settings directly on the system disk image. Used software: Ubuntu Server 12.04, VirtualBox. The configuration modifier is a self-written Ruby script.
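    One detail that may matter here, offered only as a sketch: upstart holds back the job named in a "starting" condition until jobs triggered by that event have finished, but only if they are declared as tasks. A job file along these lines (the path to the Ruby modifier is a placeholder of mine, and the behaviour should be verified against the jobs in /etc/init on 12.04) is the usual shape:

      # /etc/init/rewrite-interfaces.conf -- untested sketch
      description "apply cloned-VM network settings before networking comes up"

      start on starting networking
      task

      script
          /usr/local/sbin/apply-network-settings.rb   # hypothetical path to the config modifier
      end script

    If the interface is actually brought up by the network-interface job rather than networking, the same pattern with start on starting network-interface INTERFACE=eth0 plus task would be the hook point to try.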

    Read the article

  • convert .htaccess to nginx

    - by Chip Gà Con
    It's me again :( I was trying to install siwapp on my web server but I couldn't make it work with nginx. Here is the .htaccess file content:

      RewriteCond %{REQUEST_FILENAME} !index.php
      RewriteRule (.*)\.php$ index.php/$1
      RewriteCond $1 !^(index\.php|nhototamsu|assets|cache|xd_receiver\.html|photo|ipanel|automap|xajax_js|files|robots\.txt|favicon\.ico|ione\.ico|(.*)\.xml|ror\.xml|tool|google6afb981101589049\.html|googlec0d38cf2adbc25bc\.html|widget|iradio_admin|services|wsdl)
      RewriteRule ^(.*)$ index.php/$1 [QSA,L]

    When I access http://myurl.com/tin-tuc/tuyen-sinh/tu-van/2012/04/25757-phan-van-qua-giua-khoi-a1-va-khoi-a.html, nginx couldn't display the page correctly; it said "404 Not Found" (new URL: http://myurl.com/tin-tuc/tuyen-sinh/tu-van/2012/04/25757-phan-van-qua-giua-khoi-a1-va-khoi-a.html).
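    Those rules boil down to "send anything that isn't a whitelisted path to index.php/<original path>", so a rough nginx translation is a front-controller pair of locations like the one below. It assumes index.php sits in the site root and PHP is handled by php-fpm; treat it as a starting point, not a drop-in config:

      location / {
          try_files $uri $uri/ /index.php$uri;
      }

      location ~ ^/index\.php(/|$) {
          include fastcgi_params;
          fastcgi_split_path_info ^(/index\.php)(/.*)$;
          fastcgi_param SCRIPT_FILENAME $document_root/index.php;
          fastcgi_param PATH_INFO $fastcgi_path_info;
          fastcgi_pass 127.0.0.1:9000;    # adjust to the local php-fpm socket or port
      }

    try_files first serves real files and directories (which covers the whitelist of assets), and only falls back to the front controller for everything else.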

    Read the article

  • Get Started using Build-Deploy-Test Workflow with TFS 2012

    - by Jakob Ehn
    TFS 2012 introduces a new type of Lab environment called Standard Environment. This allows you to setup a full Build Deploy Test (BDT) workflow that will build your application, deploy it to your target machine(s) and then run a set of tests on that server to verify the deployment. In TFS 2010, you had to use System Center Virtual Machine Manager and involve half of your IT department to get going. Now all you need is a server (virtual or physical) where you want to deploy and test your application. You don’t even have to install a test agent on the machine, TFS 2012 will do this for you! Although each step is rather simple, the entire process of setting it up consists of a bunch of steps. So I thought that it could be useful to run through a typical setup.I will also link to some good guidance from MSDN on each topic. High Level Steps Install and configure Visual Studio 2012 Test Controller on Target Server Create Standard Environment Create Test Plan with Test Case Run Test Case Create Coded UI Test from Test Case Associate Coded UI Test with Test Case Create Build Definition using LabDefaultTemplate 1. Install and Configure Visual Studio 2012 Test Controller on Target Server First of all, note that you do not have to have the Test Controller running on the target server. It can be running on another server, as long as the Test Agent can communicate with the test controller and the test controller can communicate with the TFS server. If you have several machines in your environment (web server, database server etc..), the test controller can be installed either on one of those machines or on a dedicated machine. To install the test controller, simply mount the Visual Studio Agents media on the server and browse to the vstf_controller.exe file located in the TestController folder. Run through the installation, you might need to reboot the server since it installs .NET 4.5. When the test controller is installed, the Test Controller configuration tool will launch automatically (if it doesn’t, you can start it from the Start menu). Here you will supply the credentials of the account running the test controller service. Note that this account will be given the necessary permissions in TFS during the configuration. Make sure that you have entered a valid account by pressing the Test link. Also, you have to register the test controller with the TFS collection where your test plan is located (and usually the code base of course) When you press Apply Settings, all the configuration will be done. You might get some warnings at the end, that might or might not cause a problem later. Be sure to read them carefully.   For more information about configuring your test controllers, see Setting Up Test Controllers and Test Agents to Manage Tests with Visual Studio 2. Create Standard Environment Now you need to create a Lab environment in Microsoft Test Manager. Since we are using an existing physical or virtual machine we will create a Standard Environment. Open MTM and go to Lab Center. Click New to create a new environment Enter a name for the environment. Since this environment will only contain one machine, we will use the machine name for the environment (TargetServer in this case) On the next page, click Add to add a machine to the environment. Enter the name of the machine (TargetServer.Domain.Com), and give it the Web Server role. The name must be reachable both from your machine during configuration and from the TFS app tier server. 
You also need to supply an account that is a local administration on the target server. This is needed in order to automatically install a test agent later on the machine. On the next page, you can add tags to the machine. This is not needed in this scenario so go to the next page. Here you will specify which test controller to use and that you want to run UI tests on this environment. This will in result in a Test Agent being automatically installed and configured on the target server. The name of the machine where you installed the test controller should be available on the drop down list (TargetServer in this sample). If you can’t see it, you might have selected a different TFS project collection. Press Next twice and then Verify to verify all the settings: Press finish. This will now create and prepare the environment, which means that it will remote install a test agent on the machine. As part of this installation, the remote server will be restarted. 3-5. Create Test Plan, Run Test Case, Create Coded UI Test I will not cover step 3-5 here, there are plenty of information on how you create test plans and test cases and automate them using Coded UI Tests. In this example I have a test plan called My Application and it contains among other things a test suite called Automated Tests where I plan to put test cases that should be automated and executed as part of the BDT workflow. For more information about Coded UI Tests, see Verifying Code by Using Coded User Interface Tests   6. Associate Coded UI Test with Test Case OK, so now we want to automate our Coded UI Test and have it run as part of the BDT workflow. You might think that you coded UI test already is automated, but the meaning of the term here is that you link your coded UI Test to an existing Test Case, thereby making the Test Case automated. And the test case should be part of the test suite that we will run during the BDT. Open the solution that contains the coded UI test method. Open the Test Case work item that you want to automate. Go to the Associated Automation tab and click on the “…” button. Select the coded UI test that you corresponds to the test case: Press OK and the save the test case For more information about associating an automated test case with a test case, see How to: Associate an Automated Test with a Test Case 7. Create Build Definition using LabDefaultTemplate Now we are ready to create a build definition that will implement the full BDT workflow. For this purpose we will use the LabDefaultTemplate.11.xaml that comes out of the box in TFS 2012. This build process template lets you take the output of another build and deploy it to each target machine. Since the deployment process will be running on the target server, you will have less problem with permissions and firewalls than if you were to remote deploy your solution. So, before creating a BDT workflow build definition, make sure that you have an existing build definition that produces a release build of your application. Go to the Builds hub in Team Explorer and select New Build Definition Give the build definition a meaningful name, here I called it MyApplication.Deploy Set the trigger to Manual Define a workspace for the build definition. Note that a BDT build doesn’t really need a workspace, since all it does is to launch another build definition and deploy the output of that build. But TFS doesn’t allow you to save a build definition without adding at least one mapping. On Build Defaults, select the build controller. 
Since this build actually won’t produce any output, you can select the “This build does not copy output files to a drop folder” option. On the process tab, select the LabDefaultTemplate.11.xaml. This is usually located at $/TeamProject/BuildProcessTemplates/LabDefaultTemplate.11.xaml. To configure it, press the … button on the Lab Process Settings property First, select the environment that you created before: Select which build that you want to deploy and test. The “Select an existing build” option is very useful when developing the BDT workflow, because you do not have to run through the target build every time, instead it will basically just run through the deployment and test steps which speeds up the process. Here I have selected to queue a new build of the MyApplication.Test build definition On the deploy tab, you need to specify how the application should be installed on the target server. You can supply a list of deployment scripts with arguments that will be executed on the target server. In this example I execute the generated web deploy command file to deploy the solution. If you for example have databases you can use sqlpackage.exe to deploy the database. If you are producing MSI installers in your build, you can run them using msiexec.exe and so on. A good practice is to create a batch file that contain the entire deployment that you can run both locally and on the target server. Then you would just execute the deployment batch file here in one single step. The workflow defines some variables that are useful when running the deployments. These variables are: $(BuildLocation) The full path to where your build files are located $(InternalComputerName_<VM Name>) The computer name for a virtual machine in a SCVMM environment $(ComputerName_<VM Name>) The fully qualified domain name of the virtual machine As you can see, I specify the path to the myapplication.deploy.cmd file using the $(BuildLocation) variable, which is the drop folder of the MyApplication.Test build. Note: The test agent account must have read permission in this drop location. You can find more information here on Building your Deployment Scripts On the last tab, we specify which tests to run after deployment. Here I select the test plan and the Automated Tests test suite that we saw before: Note that I also selected the automated test settings (called TargetServer in this case) that I have defined for my test plan. In here I define what data that should be collected as part of the test run. For more information about test settings, see Specifying Test Settings for Microsoft Test Manager Tests We are done! Queue your BDT build and wait for it to finish. If the build succeeds, your build summary should look something like this:
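    Going back to the deployment-script step above: the "one single batch file" suggestion could look roughly like this. Every path and name below is invented for illustration; the general pattern is that web deploy packages generate a <name>.deploy.cmd in the drop, and databases can go through sqlpackage.exe as mentioned:

      rem deploy.cmd -- run on the target server by the LabDefaultTemplate deployment step
      rem %1 is the build drop folder, passed in as "$(BuildLocation)"
      set DROP=%1

      rem install the web site using the package produced by the build
      call "%DROP%\_PublishedWebsites\MyApplication_Package\MyApplication.deploy.cmd" /Y

      rem publish the database schema from the dacpac in the same drop
      sqlpackage.exe /Action:Publish /SourceFile:"%DROP%\MyDatabase.dacpac" ^
          /TargetServerName:localhost /TargetDatabaseName:MyDatabase

    Keeping everything in one script also means you can run the exact same deployment locally when debugging it.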

    Read the article

  • dansguardian error: filterports must match number of filterips (pfsense)

    - by Bulki
    Hi, I'm setting up pfSense with the squid3 and dansguardian packages. When I try to start the dansguardian service, however, I get the following errors:

      May 27 22:17:37 php: /pkg_edit.php: The command '/usr/local/etc/rc.d/dansguardian.sh start' returned exit code '1', the output was 'kern.ipc.somaxconn: 16384 -> 16384 kern.maxfiles: 131072 -> 131072 kern.maxfilesperproc: 104856 -> 104856 kern.threads.max_threads_per_proc: 4096 -> 4096 Starting dansguardian. filterports (2) must match number of filterips (1) Error parsing the dansguardian.conf file or other DansGuardian configuration files /usr/local/etc/rc.d/dansguardian.sh: WARNING: failed to start dansguardian'
      May 27 22:17:37 root: /usr/local/etc/rc.d/dansguardian.sh: WARNING: failed to start dansguardian
      May 27 22:17:37 dansguardian[52944]: Error parsing the dansguardian.conf file or other DansGuardian configuration files
      May 27 22:17:37 dansguardian[52944]: filterports must match number of filterips

    What does "filterports must match number of filterips" mean? Any thoughts on the matter?
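    The message is fairly literal: dansguardian.conf expects one filterip line for every filterports line, and here the generated config apparently lists two ports but only one IP. The matching section should look roughly like the sketch below (addresses and ports are invented examples):

      # dansguardian.conf -- each listening port needs a matching filterip entry
      # (a blank filterip is normally treated as "listen on all interfaces")
      filterip = 192.168.1.1
      filterip = 192.168.1.1
      filterports = 8080
      filterports = 8081

    In other words, either add the missing filterip line or drop the extra filterports entry so the counts agree; on pfSense that usually means fixing it through the package settings rather than editing the file by hand, since the GUI regenerates it.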

    Read the article

  • How to make HOME or END keys work in mc running on OS X (ssh)

    - by Sorin Sbarnea
    I installed MacPorts on OS X 10.5 and I found out that when I connect to the computer using SSH and use mc (Midnight Commander), the HOME and END keys do not work. I should mention that I'm using PuTTY, and I am able to use the keyboard just fine on Linux machines like Fedora, Ubuntu, etc. Here is the PuTTY keyboard configuration (a configuration I have found to be optimal over time):

      Backspace key: 127
      Home/End keys: Standard
      Function keys: Xterm R6
      Cursor keys: Normal
      Numpad: normal
      Terminal type string: xterm-color

    I'm looking for a command-line solution or script that makes these changes, which would make it much easier to write an OS-preparation script for configuring a new machine.

    Read the article

  • Save all music files in a VLC xspf playlist to another folder

    - by Parto
    I have a VLC playlist (.xspf) of over 100 songs scattered all over my computer. I'm looking for a way to save this playlist and all its songs to another folder: a flash drive, an external drive, or just a different location on my computer. How can I do this? EDIT: The xspf playlist is XML, in this format:

      <?xml version="1.0" encoding="UTF-8"?>
      <playlist xmlns="http://xspf.org/ns/0/" xmlns:vlc="http://www.videolan.org/vlc/playlist/ns/0/" version="1">
        <title>Playlist</title>
        <trackList>
          <track>
            <location>file:///home/subroot/Music/3%20Days%20Grace%20-%20Wake%20Up.mp3</location>
            <title>Wake Up</title>
            <creator>3 Days Grace</creator>
            <album>Three Days Grace</album>
            <trackNum>10</trackNum>
            <annotation> </annotation>
            <duration>206036</duration>
            <extension application="http://www.videolan.org/vlc/playlist/0">
              <vlc:id>0</vlc:id>
            </extension>
          </track>
          .
          . [Many more tracks here]
          .
        </trackList>
        <extension application="http://www.videolan.org/vlc/playlist/0">
          <vlc:item tid="0"/>
          .
          . [Other id's here]
          .
        </extension>
      </playlist>
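    Since the playlist already carries an absolute file:// URL for every track, a short script can do the copying. Here is a sketch in Python; the script name and destination path are whatever you choose, and only local file entries are handled:

      #!/usr/bin/env python3
      # copy_playlist.py -- copy every local file referenced by an .xspf playlist into one folder
      import shutil
      import sys
      import xml.etree.ElementTree as ET
      from pathlib import Path
      from urllib.parse import unquote, urlparse

      NS = {"xspf": "http://xspf.org/ns/0/"}

      def copy_playlist(playlist, dest):
          dest = Path(dest)
          dest.mkdir(parents=True, exist_ok=True)
          tree = ET.parse(playlist)
          for loc in tree.getroot().iterfind(".//xspf:track/xspf:location", namespaces=NS):
              url = urlparse(loc.text)
              if url.scheme != "file":          # skip streams and other non-local entries
                  continue
              src = Path(unquote(url.path))     # file:///home/... -> /home/...
              if src.is_file():
                  shutil.copy2(src, dest / src.name)
              else:
                  print("missing:", src)

      if __name__ == "__main__":
          copy_playlist(sys.argv[1], sys.argv[2])

    Run it as, for example, python3 copy_playlist.py playlist.xspf /media/usbdrive/music; the VLC-specific extension tags in the playlist are simply ignored.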

    Read the article

  • Async ignored on AJAX requests on Nginx server

    - by eComEvo
    Despite sending an async request to the server over AJAX, the server will not respond until the previous unrelated request has finished. The following code is only broken in this way on Nginx, but runs perfectly on Apache. This call will start a background process and it waits for it to complete so it can display the final result. $.ajax({ type: 'GET', async: true, url: $(this).data('route'), data: $('input[name=data]').val(), dataType: 'json', success: function (data) { /* do stuff */} error: function (data) { /* handle errors */} }); The below is called after the above, which on Apache requires 100ms to execute and repeats itself, showing progress for data being written in the background: checkStatusInterval = setInterval(function () { $.ajax({ type: 'GET', async: false, cache: false, url: '/process-status?process=' + currentElement.attr('id'), dataType: 'json', success: function (data) { /* update progress bar and status message */ } }); }, 1000); Unfortunately, when this script is run from nginx, the above progress request never even finishes a single request until the first AJAX request that sent the data is done. If I change the async to TRUE in the above, it executes one every interval, but none of them complete until that very first AJAX request finishes. Here is the main nginx conf file: #user nobody; worker_processes 1; #error_log logs/error.log; #error_log logs/error.log notice; #error_log logs/error.log info; #pid logs/nginx.pid; events { worker_connections 1024; } http { include mime.types; default_type application/octet-stream; server_names_hash_bucket_size 64; # configure temporary paths # nginx is started with param -p, setting nginx path to serverpack installdir fastcgi_temp_path temp/fastcgi; uwsgi_temp_path temp/uwsgi; scgi_temp_path temp/scgi; client_body_temp_path temp/client-body 1 2; proxy_temp_path temp/proxy; log_format main '$remote_addr - $remote_user [$time_local] "$request" ' '$status $body_bytes_sent "$http_referer" ' '"$http_user_agent" "$http_x_forwarded_for"'; #access_log logs/access.log main; # Sendfile copies data between one FD and other from within the kernel. # More efficient than read() + write(), since the requires transferring data to and from the user space. sendfile on; # Tcp_nopush causes nginx to attempt to send its HTTP response head in one packet, # instead of using partial frames. This is useful for prepending headers before calling sendfile, # or for throughput optimization. tcp_nopush on; # don't buffer data-sends (disable Nagle algorithm). Good for sending frequent small bursts of data in real time. tcp_nodelay on; types_hash_max_size 2048; # Timeout for keep-alive connections. Server will close connections after this time. keepalive_timeout 90; # Number of requests a client can make over the keep-alive connection. This is set high for testing. keepalive_requests 100000; # allow the server to close the connection after a client stops responding. Frees up socket-associated memory. reset_timedout_connection on; # send the client a "request timed out" if the body is not loaded by this time. Default 60. client_header_timeout 20; client_body_timeout 60; # If the client stops reading data, free up the stale client connection after this much time. Default 60. 
send_timeout 60; # Size Limits client_body_buffer_size 64k; client_header_buffer_size 4k; client_max_body_size 8M; # FastCGI fastcgi_connect_timeout 60; fastcgi_send_timeout 120; fastcgi_read_timeout 300; # default: 60 secs; when step debugging with XDEBUG, you need to increase this value fastcgi_buffer_size 64k; fastcgi_buffers 4 64k; fastcgi_busy_buffers_size 128k; fastcgi_temp_file_write_size 128k; # Caches information about open FDs, freqently accessed files. open_file_cache max=200000 inactive=20s; open_file_cache_valid 30s; open_file_cache_min_uses 2; open_file_cache_errors on; # Turn on gzip output compression to save bandwidth. # http://wiki.nginx.org/HttpGzipModule gzip on; gzip_disable "MSIE [1-6]\.(?!.*SV1)"; gzip_http_version 1.1; gzip_vary on; gzip_proxied any; #gzip_proxied expired no-cache no-store private auth; gzip_comp_level 6; gzip_buffers 16 8k; gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript application/javascript; # show all files and folders autoindex on; server { # access from localhost only listen 127.0.0.1:80; server_name localhost; root www; # the following default "catch-all" configuration, allows access to the server from outside. # please ensure your firewall allows access to tcp/port 80. check your "skype" config. # listen 80; # server_name _; log_not_found off; charset utf-8; access_log logs/access.log main; # handle files in the root path /www location / { index index.php index.html index.htm; } #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # error_page 500 502 503 504 /50x.html; location = /50x.html { root www; } # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9100 # location ~ \.php$ { try_files $uri =404; fastcgi_pass 127.0.0.1:9100; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; include fastcgi_params; } # add expire headers location ~* ^.+.(gif|ico|jpg|jpeg|png|flv|swf|pdf|mp3|mp4|xml|txt|js|css)$ { expires 30d; } # deny access to .htaccess files (if Apache's document root concurs with nginx's one) # deny access to git & svn repositories location ~ /(\.ht|\.git|\.svn) { deny all; } } # include config files of "enabled" domains include domains-enabled/*.conf; } Here is the enabled domain conf file: access_log off; access_log C:/server/www/test.dev/logs/access.log; error_log C:/server/www/test.dev/logs/error.log; # HTTP Server server { listen 127.0.0.1:80; server_name test.dev; root C:/server/www/test.dev/public; index index.php; rewrite_log on; default_type application/octet-stream; #include /etc/nginx/mime.types; # Include common configurations. include domains-common/location.conf; } # HTTPS server server { listen 443 ssl; server_name test.dev; root C:/server/www/test.dev/public; index index.php; rewrite_log on; default_type application/octet-stream; #include /etc/nginx/mime.types; # Include common configurations. include domains-common/location.conf; include domains-common/ssl.conf; } Contents of ssl.conf: # OpenSSL for HTTPS connections. 
ssl on; ssl_certificate C:/server/bin/openssl/certs/cert.pem; ssl_certificate_key C:/server/bin/openssl/certs/cert.key; ssl_session_timeout 5m; ssl_protocols SSLv3 TLSv1 TLSv1.1 TLSv1.2; ssl_ciphers HIGH:!aNULL:!MD5; ssl_prefer_server_ciphers on; # Pass the PHP scripts to FastCGI server listening on 127.0.0.1:9100 location ~ \.php$ { try_files $uri =404; fastcgi_param HTTPS on; fastcgi_pass 127.0.0.1:9100; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; include fastcgi_params; } Contents of location.conf: # Remove trailing slash to please Laravel routing system. if (!-d $request_filename) { rewrite ^/(.+)/$ /$1 permanent; } location / { try_files $uri $uri/ /index.php?$query_string; } # We don't need .ht files with nginx. location ~ /(\.ht|\.git|\.svn) { deny all; } # Added cache headers for images. location ~* \.(png|jpg|jpeg|gif)$ { expires 30d; log_not_found off; } # Only 3 hours on CSS/JS to allow me to roll out fixes during early weeks. location ~* \.(js|css)$ { expires 3h; log_not_found off; } # Add expire headers. location ~* ^.+.(gif|ico|jpg|jpeg|png|flv|swf|pdf|mp3|mp4|xml|txt)$ { expires 30d; } # Pass the PHP scripts to FastCGI server listening on 127.0.0.1:9100 location ~ \.php$ { try_files $uri /index.php =404; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; include fastcgi_params; fastcgi_pass 127.0.0.1:9100; } Any ideas where this is going wrong?
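    One cause that is invisible in all of the configuration above, so take this purely as a guess to rule out: PHP's default file-based session handler holds an exclusive lock on the session file for the whole request, so a long-running request blocks every later request that carries the same session cookie, which looks exactly like async being ignored. Whether Apache behaved differently would then come down to how PHP was wired up there rather than to nginx itself. Releasing the lock early in the long-running script looks roughly like this:

      <?php
      // sketch only: inside the long-running endpoint that kicks off the background work
      session_start();
      $processId = isset($_SESSION['process_id']) ? $_SESSION['process_id'] : null;  // hypothetical key
      session_write_close();   // from here on, parallel requests with the same cookie are no longer serialized

      // ... start the slow work and write progress somewhere the status endpoint can read it ...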

    Read the article

  • Problem using a public key when connecting to a SSH server running on Cygwin

    - by Deleted
    We have installed Cygwin on a Windows Server 2008 Standard server and it working pretty well. Unfortunately we still have a big problem. We want to connect using a public key through SSH which doesn't work. It always falls back to using password login. We have appended our public key to ~/.ssh/authorized_keys on the server and we have our private and public key in ~/.ssh/id_dsa respective ~/.ssh/id_dsa.pub on the client. When debugging the SSH login session we see that the key is offered by the server apparently rejects it by some unknown reason. The SSH output when connecting from an Ubuntu 9.10 desktop with debug information enabled: $ ssh -v 192.168.10.11 OpenSSH_5.1p1 Debian-6ubuntu2, OpenSSL 0.9.8g 19 Oct 2007 debug1: Reading configuration data /home/myuseraccount/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Applying options for debug1: Connecting to 192.168.10.11 [192.168.10.11] port 22. debug1: Connection established. debug1: identity file /home/myuseraccount/.ssh/identity type -1 debug1: identity file /home/myuseraccount/.ssh/id_rsa type -1 debug1: identity file /home/myuseraccount/.ssh/id_dsa type 2 debug1: Checking blacklist file /usr/share/ssh/blacklist.DSA-1024 debug1: Checking blacklist file /etc/ssh/blacklist.DSA-1024 debug1: Remote protocol version 2.0, remote software version OpenSSH_5.3 debug1: match: OpenSSH_5.3 pat OpenSSH debug1: Enabling compatibility mode for protocol 2.0 debug1: Local version string SSH-2.0-OpenSSH_5.1p1 Debian-6ubuntu2 debug1: SSH2_MSG_KEXINIT sent debug1: SSH2_MSG_KEXINIT received debug1: kex: server->client aes128-cbc hmac-md5 none debug1: kex: client->server aes128-cbc hmac-md5 none debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP debug1: SSH2_MSG_KEX_DH_GEX_INIT sent debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY debug1: Host '192.168.10.11' is known and matches the RSA host key. 
debug1: Found key in /home/myuseraccount/.ssh/known_hosts:12 debug1: ssh_rsa_verify: signature correct debug1: SSH2_MSG_NEWKEYS sent debug1: expecting SSH2_MSG_NEWKEYS debug1: SSH2_MSG_NEWKEYS received debug1: SSH2_MSG_SERVICE_REQUEST sent debug1: SSH2_MSG_SERVICE_ACCEPT received debug1: Authentications that can continue: publickey,password,keyboard-interactive debug1: Next authentication method: publickey debug1: Offering public key: /home/myuseraccount/.ssh/id_dsa debug1: Authentications that can continue: publickey,password,keyboard-interactive debug1: Trying private key: /home/myuseraccount/.ssh/identity debug1: Trying private key: /home/myuseraccount/.ssh/id_rsa debug1: Next authentication method: keyboard-interactive debug1: Authentications that can continue: publickey,password,keyboard-interactive debug1: Next authentication method: password [email protected]'s password: The version of Cygwin: $ uname -a CYGWIN_NT-6.0 servername 1.7.1(0.218/5/3) 2009-12-07 11:48 i686 Cygwin The installed packages: $ cygcheck -c Cygwin Package Information Package Version Status _update-info-dir 00871-1 OK alternatives 1.3.30c-10 OK arj 3.10.22-1 OK aspell 0.60.5-1 OK aspell-en 6.0.0-1 OK aspell-sv 0.50.2-2 OK autossh 1.4b-1 OK base-cygwin 2.1-1 OK base-files 3.9-3 OK base-passwd 3.1-1 OK bash 3.2.49-23 OK bash-completion 1.1-2 OK bc 1.06-2 OK bzip2 1.0.5-10 OK cabextract 1.1-1 OK compface 1.5.2-1 OK coreutils 7.0-2 OK cron 4.1-59 OK crypt 1.1-1 OK csih 0.9.1-1 OK curl 7.19.6-1 OK cvs 1.12.13-10 OK cvsutils 0.2.5-1 OK cygrunsrv 1.34-1 OK cygutils 1.4.2-1 OK cygwin 1.7.1-1 OK cygwin-doc 1.5-1 OK cygwin-x-doc 1.1.0-1 OK dash 0.5.5.1-2 OK diffutils 2.8.7-2 OK doxygen 1.6.1-2 OK e2fsprogs 1.35-3 OK editrights 1.01-2 OK emacs 23.1-10 OK emacs-X11 23.1-10 OK file 5.04-1 OK findutils 4.5.5-1 OK flip 1.19-1 OK font-adobe-dpi75 1.0.1-1 OK font-alias 1.0.2-1 OK font-encodings 1.0.3-1 OK font-misc-misc 1.1.0-1 OK fontconfig 2.8.0-1 OK gamin 0.1.10-10 OK gawk 3.1.7-1 OK gettext 0.17-11 OK gnome-icon-theme 2.28.0-1 OK grep 2.5.4-2 OK groff 1.19.2-2 OK gvim 7.2.264-1 OK gzip 1.3.12-2 OK hicolor-icon-theme 0.11-1 OK inetutils 1.5-6 OK ipc-utils 1.0-1 OK keychain 2.6.8-1 OK less 429-1 OK libaspell15 0.60.5-1 OK libatk1.0_0 1.28.0-1 OK libaudio2 1.9.2-1 OK libbz2_1 1.0.5-10 OK libcairo2 1.8.8-1 OK libcurl4 7.19.6-1 OK libdb4.2 4.2.52.5-2 OK libdb4.5 4.5.20.2-2 OK libexpat1 2.0.1-1 OK libfam0 0.1.10-10 OK libfontconfig1 2.8.0-1 OK libfontenc1 1.0.5-1 OK libfreetype6 2.3.12-1 OK libgcc1 4.3.4-3 OK libgdbm4 1.8.3-20 OK libgdk_pixbuf2.0_0 2.18.6-1 OK libgif4 4.1.6-10 OK libGL1 7.6.1-1 OK libglib2.0_0 2.22.4-2 OK libglitz1 0.5.6-10 OK libgmp3 4.3.1-3 OK libgtk2.0_0 2.18.6-1 OK libICE6 1.0.6-1 OK libiconv2 1.13.1-1 OK libidn11 1.16-1 OK libintl3 0.14.5-1 OK libintl8 0.17-11 OK libjasper1 1.900.1-1 OK libjbig2 2.0-11 OK libjpeg62 6b-21 OK libjpeg7 7-10 OK liblzma1 4.999.9beta-10 OK libncurses10 5.7-18 OK libncurses8 5.5-10 OK libncurses9 5.7-16 OK libopenldap2_3_0 2.3.43-1 OK libpango1.0_0 1.26.2-1 OK libpcre0 8.00-1 OK libpixman1_0 0.16.6-1 OK libpng12 1.2.35-10 OK libpopt0 1.6.4-4 OK libpq5 8.2.11-1 OK libreadline6 5.2.14-12 OK libreadline7 6.0.3-2 OK libsasl2 2.1.19-3 OK libSM6 1.1.1-1 OK libssh2_1 1.2.2-1 OK libssp0 4.3.4-3 OK libstdc++6 4.3.4-3 OK libtiff5 3.9.2-1 OK libwrap0 7.6-20 OK libX11_6 1.3.3-1 OK libXau6 1.0.5-1 OK libXaw3d7 1.5D-8 OK libXaw7 1.0.7-1 OK libxcb-render-util0 0.3.6-1 OK libxcb-render0 1.5-1 OK libxcb1 1.5-1 OK libXcomposite1 0.4.1-1 OK libXcursor1 1.1.10-1 OK libXdamage1 1.1.2-1 OK libXdmcp6 
1.0.3-1 OK libXext6 1.1.1-1 OK libXfixes3 4.0.4-1 OK libXft2 2.1.14-1 OK libXi6 1.3-1 OK libXinerama1 1.1-1 OK libxkbfile1 1.0.6-1 OK libxml2 2.7.6-1 OK libXmu6 1.0.5-1 OK libXmuu1 1.0.5-1 OK libXpm4 3.5.8-1 OK libXrandr2 1.3.0-10 OK libXrender1 0.9.5-1 OK libXt6 1.0.7-1 OK links 1.00pre20-1 OK login 1.10-10 OK luit 1.0.5-1 OK lynx 2.8.5-4 OK man 1.6e-1 OK minires 1.02-1 OK mkfontdir 1.0.5-1 OK mkfontscale 1.0.7-1 OK openssh 5.4p1-1 OK openssl 0.9.8m-1 OK patch 2.5.8-9 OK patchutils 0.3.1-1 OK perl 5.10.1-3 OK rebase 3.0.1-1 OK run 1.1.12-11 OK screen 4.0.3-5 OK sed 4.1.5-2 OK shared-mime-info 0.70-1 OK tar 1.22.90-1 OK terminfo 5.7_20091114-13 OK terminfo0 5.5_20061104-11 OK texinfo 4.13-3 OK tidy 041206-1 OK time 1.7-2 OK tzcode 2009k-1 OK unzip 6.0-10 OK util-linux 2.14.1-1 OK vim 7.2.264-2 OK wget 1.11.4-4 OK which 2.20-2 OK wput 0.6.1-2 OK xauth 1.0.4-1 OK xclipboard 1.1.0-1 OK xcursor-themes 1.0.2-1 OK xemacs 21.4.22-1 OK xemacs-emacs-common 21.4.22-1 OK xemacs-sumo 2007-04-27-1 OK xemacs-tags 21.4.22-1 OK xeyes 1.1.0-1 OK xinit 1.2.1-1 OK xinput 1.5.0-1 OK xkbcomp 1.1.1-1 OK xkeyboard-config 1.8-1 OK xkill 1.0.2-1 OK xmodmap 1.0.4-1 OK xorg-docs 1.5-1 OK xorg-server 1.7.6-2 OK xrdb 1.0.6-1 OK xset 1.1.0-1 OK xterm 255-1 OK xz 4.999.9beta-10 OK zip 3.0-11 OK zlib 1.2.3-10 OK zlib-devel 1.2.3-10 OK zlib0 1.2.3-10 OK The ssh deamon configuration file: $ cat /etc/sshd_config # $OpenBSD: sshd_config,v 1.80 2008/07/02 02:24:18 djm Exp $ # This is the sshd server system-wide configuration file. See # sshd_config(5) for more information. # This sshd was compiled with PATH=/bin:/usr/sbin:/sbin:/usr/bin # The strategy used for options in the default sshd_config shipped with # OpenSSH is to specify options with their default value where # possible, but leave them commented. Uncommented options change a # default value. Port 22 #AddressFamily any #ListenAddress 0.0.0.0 #ListenAddress :: # Disable legacy (protocol version 1) support in the server for new # installations. In future the default will change to require explicit # activation of protocol 1 Protocol 2 # HostKey for protocol version 1 #HostKey /etc/ssh_host_key # HostKeys for protocol version 2 #HostKey /etc/ssh_host_rsa_key #HostKey /etc/ssh_host_dsa_key # Lifetime and size of ephemeral version 1 server key #KeyRegenerationInterval 1h #ServerKeyBits 1024 # Logging # obsoletes QuietMode and FascistLogging #SyslogFacility AUTH #LogLevel INFO # Authentication: #LoginGraceTime 2m #PermitRootLogin yes StrictModes no #MaxAuthTries 6 #MaxSessions 10 RSAAuthentication yes PubkeyAuthentication yes AuthorizedKeysFile .ssh/authorized_keys # For this to work you will also need host keys in /etc/ssh_known_hosts #RhostsRSAAuthentication no # similar for protocol version 2 #HostbasedAuthentication no # Change to yes if you don't trust ~/.ssh/known_hosts for # RhostsRSAAuthentication and HostbasedAuthentication #IgnoreUserKnownHosts no # Don't read the user's ~/.rhosts and ~/.shosts files #IgnoreRhosts yes # To disable tunneled clear text passwords, change to no here! #PasswordAuthentication yes #PermitEmptyPasswords no # Change to no to disable s/key passwords #ChallengeResponseAuthentication yes # Kerberos options #KerberosAuthentication no #KerberosOrLocalPasswd yes #KerberosTicketCleanup yes #KerberosGetAFSToken no # GSSAPI options #GSSAPIAuthentication no #GSSAPICleanupCredentials yes # Set this to 'yes' to enable PAM authentication, account processing, # and session processing. 
If this is enabled, PAM authentication will # be allowed through the ChallengeResponseAuthentication and # PasswordAuthentication. Depending on your PAM configuration, # PAM authentication via ChallengeResponseAuthentication may bypass # the setting of "PermitRootLogin without-password". # If you just want the PAM account and session checks to run without # PAM authentication, then enable this but set PasswordAuthentication # and ChallengeResponseAuthentication to 'no'. #UsePAM no AllowAgentForwarding yes AllowTcpForwarding yes GatewayPorts yes X11Forwarding yes X11DisplayOffset 10 X11UseLocalhost no #PrintMotd yes #PrintLastLog yes TCPKeepAlive yes #UseLogin no UsePrivilegeSeparation yes #PermitUserEnvironment no #Compression delayed #ClientAliveInterval 0 #ClientAliveCountMax 3 #UseDNS yes #PidFile /var/run/sshd.pid #MaxStartups 10 #PermitTunnel no #ChrootDirectory none # no default banner path #Banner none # override default of no subsystems Subsystem sftp /usr/sbin/sftp-server # Example of overriding settings on a per-user basis #Match User anoncvs #X11Forwarding yes #AllowTcpForwarding yes #ForceCommand cvs server I hope this information is enough to solve the problem. In case any more is needed please comment and I'll add it. Thank you for reading!
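    When the client-side -v output only shows the key being offered and then silently skipped, the server-side view is usually more telling. A generic next step, nothing Cygwin-specific assumed, is to run a second sshd in debug mode on a spare port and watch why it refuses the key:

      # on the Windows/Cygwin server, from an elevated Cygwin shell:
      /usr/sbin/sshd -d -p 2222

      # then from the Ubuntu client (user name and address are placeholders):
      ssh -v -p 2222 myuseraccount@192.168.10.11

    The -d output typically states the reason outright, for example bad ownership or permissions on ~/.ssh or authorized_keys, or a home directory that does not match the account's /etc/passwd entry.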

    Read the article

  • How to disable "Automatically search for drivers and install" in Windows 7

    - by raegadfsgfsdg
    I connected my TI-89 to my Windows 7 system, and apparently it is not plug-and-play. Windows offered a configuration setting giving it the option to automatically search the Windows Update site for drivers and install them each time I connect an unrecognized device. I selected that link with the intention of it being temporary, just that one time. That didn't work, and locating/installing a driver was not successful. How can I reset that setting so it does not automatically install whatever drivers it thinks might be fitting? I could not find that configuration setting when I looked for it recently.

    Read the article

  • Ignoring GET parameters in Varnish VCL

    - by JamesHarrison
    Okay: I've got a site set up which has some APIs we expose to developers, in the format /api/item.xml?type_ids=34,35,37&region_ids=1000002,1000003&key=SOMERANDOMALPHANUM. In this URI, type_ids is always set; region_ids and key are optional. The important thing to note is that the key variable does not affect the content of the response. It is used for internal tracking of requests so we can identify people who make slow or otherwise unwanted requests. In Varnish, we have a VCL like this:

      if (req.http.host ~ "the-site-in-question.com") {
          if (req.url ~ "^/api/.+\.xml") {
              unset req.http.cookie;
          }
      }

    We just strip cookies out and let the backend do the rest as far as times are concerned (this is a hack-around, since Rails/authlogic sends session cookies with API responses). At present, though, distinct developers are basically hitting different caches, since &key=SOMEALPHANUM is considered part of the Varnish hash for storage. This is obviously not a great solution and I'm trying to work out how to tell Varnish to ignore that part of the URI.
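    Since type_ids is always present, key should never be the first parameter, so stripping an &key=... chunk out of req.url before Varnish hashes it is usually enough. A sketch for vcl_recv in the same Varnish 2/3-era syntax as the snippet above (test the regex against real URLs before relying on it):

      sub vcl_recv {
          if (req.http.host ~ "the-site-in-question.com" && req.url ~ "^/api/.+\.xml") {
              unset req.http.cookie;
              # drop the tracking parameter so it no longer takes part in the cache hash
              set req.url = regsuball(req.url, "&key=[^&]*", "");
          }
      }

    One trade-off: after stripping, the backend only sees the key on cache misses, so the internal tracking becomes partial.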

    Read the article

  • Hosting 2 different SSL domains on the same IP [closed]

    - by Jim
    Possible Duplicate: Multiple SSL domains on the same IP address and same port? I have 2 different domains, domain1.com and domain2.net. Both require SSL certificates. I've got the certificates signed by a CA and now need to set them up on Apache. I've got a single IP address and have found a post here: SSL site not using the correct IP in Apache and Ubuntu that apparently works for ubuntu but I am on CentOS and have not had the same luck trying that configuration. When I try that configuration, Apache dies telling me that the port is already in use: (98)Address already in use: make_sock: could not bind to address 0.0.0.0:443 I had SSL working for a single domain but now that I've added the second, I get a browser warning about the cert not belonging to the domain. How do I get both domains to work using virtual host containers?
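    For what it's worth, the "Address already in use ... 0.0.0.0:443" error usually just means Listen 443 ended up declared twice (for example once in ssl.conf and once in the virtual host include). Serving two certificates from one IP then relies on SNI, which needs Apache 2.2.12 or later built against OpenSSL 0.9.8f or later, and clients that send SNI; older ones such as IE on Windows XP will still see a mismatch warning. A name-based layout would look roughly like this, with placeholder paths:

      # make sure "Listen 443" appears exactly once across the whole configuration
      Listen 443
      NameVirtualHost *:443

      <VirtualHost *:443>
          ServerName domain1.com
          SSLEngine on
          SSLCertificateFile    /etc/pki/tls/certs/domain1.com.crt
          SSLCertificateKeyFile /etc/pki/tls/private/domain1.com.key
      </VirtualHost>

      <VirtualHost *:443>
          ServerName domain2.net
          SSLEngine on
          SSLCertificateFile    /etc/pki/tls/certs/domain2.net.crt
          SSLCertificateKeyFile /etc/pki/tls/private/domain2.net.key
      </VirtualHost>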

    Read the article

  • How do I stop XNA/Visual Studio from rebuilding my content project every time I build?

    - by Phil Quinn
    My group and I are working on a game in XNA 4.0 with Visual Studio 2010/2012. The main solution has 6 projects: 2 XNA game projects (1 executable/ 1 class library), 1 WPF executable for the level editor, 2 standard class libraries, and a content project. Originally, the editor and engine XNA game projects had a content reference to separate content projects. Recently, I consolidated the content projects into one to simplify asset additions. Since pushing these changes to our git repo, certain members of my group have been experiencing weird build issues. Every time they run the project, they have to re-build all of the assets. This happens regardless of whether any changes were made, even if they just run the project directly after building. I've taken a few steps to figure out why this is happening. Below is the MSBuild output set on Normal verbosity. The seemingly important part is at 4, with the line 4> Rebuilding all content because build settings have changed 1>------ Build started: Project: Engine.Core, Configuration: Debug x86 ------ 1>Build started 11/29/2012 3:24:24 AM. 1>ResolveAssemblyReferences: 1> A TargetFramework profile exclusion list will be generated. 1>EmbedXnaFrameworkRuntimeProfile: 1>Skipping target "EmbedXnaFrameworkRuntimeProfile" because all output files are up-to-date with respect to the input files. 1>GenerateTargetFrameworkMonikerAttribute: 1>Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the input files. 1>CoreCompile: 1>Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files. 1>XnaWriteCacheFile: 1>Skipping target "XnaWriteCacheFile" because all output files are up-to-date with respect to the input files. 1>_CopyOutOfDateSourceItemsToOutputDirectoryAlways: 1> Copying file from "<solution-dir>\src\Engine.Core\DialoguePrototypeTestDB.s3db" to "bin\x86\Debug\DialoguePrototypeTestDB.s3db". 1>_CopyAppConfigFile: 1>Skipping target "_CopyAppConfigFile" because all output files are up-to-date with respect to the input files. 1>CopyFilesToOutputDirectory: 1> Engine.Core -> <solution-dir>\src\Engine.Core\bin\x86\Debug\TimeSink.Engine.Core.dll 1> 1>Build succeeded. 1> 1>Time Elapsed 00:00:00.13 2>------ Build started: Project: TimeSink.Entities, Configuration: Debug x86 ------ 2>Build started 11/29/2012 3:24:25 AM. 2>ResolveAssemblyReferences: 2> A TargetFramework profile exclusion list will be generated. 2>EmbedXnaFrameworkRuntimeProfile: 2>Skipping target "EmbedXnaFrameworkRuntimeProfile" because all output files are up-to-date with respect to the input files. 2>GenerateTargetFrameworkMonikerAttribute: 2>Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the input files. 2>CoreCompile: 2>Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files. 2>XnaWriteCacheFile: 2>Skipping target "XnaWriteCacheFile" because all output files are up-to-date with respect to the input files. 2>_CopyOutOfDateSourceItemsToOutputDirectoryAlways: 2> Copying file from "<solution-dir>\src\Engine.Core\DialoguePrototypeTestDB.s3db" to "bin\x86\Debug\DialoguePrototypeTestDB.s3db". 2>CopyFilesToOutputDirectory: 2> TimeSink.Entities -> <solution-dir>\src\TimeSink.Entities\bin\x86\Debug\TimeSink.Entities.dll 2> 2>Build succeeded. 
2> 2>Time Elapsed 00:00:00.11 3>------ Build started: Project: Editor (Editor\Editor), Configuration: Debug x86 ------ 4>------ Build started: Project: Engine.Game, Configuration: Debug x86 ------ 3>Build started 11/29/2012 3:24:25 AM. 3>CoreCompile: 3> All content is already up to date 3>ResolveAssemblyReferences: 3> A TargetFramework profile exclusion list will be generated. 3>EmbedXnaFrameworkRuntimeProfile: 3>Skipping target "EmbedXnaFrameworkRuntimeProfile" because all output files are up-to-date with respect to the input files. 3>GenerateTargetFrameworkMonikerAttribute: 3>Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the input files. 3>CoreCompile: 3>Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files. 3>XnaWriteCacheFile: 3>Skipping target "XnaWriteCacheFile" because all output files are up-to-date with respect to the input files. 3>_CopyOutOfDateSourceItemsToOutputDirectoryAlways: 3> Copying file from "<solution-dir>\src\Engine.Core\DialoguePrototypeTestDB.s3db" to "bin\x86\Debug\DialoguePrototypeTestDB.s3db". 3>_CopyOutOfDateNestedContentItemsToOutputDirectory: 3>Skipping target "_CopyOutOfDateNestedContentItemsToOutputDirectory" because all output files are up-to-date with respect to the input files. 3>CopyFilesToOutputDirectory: 3> Editor -> <solution-dir>\src\Editor\Editor\bin\x86\Debug\Editor.dll 3> 3>Build succeeded. 3> 3>Time Elapsed 00:00:00.39 4>Build started 11/29/2012 3:24:25 AM. 4>CoreCompile: 4> Rebuilding all content because build settings have changed 4> Building Textures\circle.png -> <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\Textures\circle.xnb 4> Importing Textures\circle.png with Microsoft.Xna.Framework.Content.Pipeline.TextureImporter 4> Processing Textures\circle.png with Microsoft.Xna.Framework.Content.Pipeline.Processors.TextureProcessor 4> Compiling <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\Textures\circle.xnb 4> Building Textures\giroux.png -> <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\Textures\giroux.xnb 4> Importing Textures\giroux.png with Microsoft.Xna.Framework.Content.Pipeline.TextureImporter 4> Processing Textures\giroux.png with Microsoft.Xna.Framework.Content.Pipeline.Processors.TextureProcessor 4> Compiling <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\Textures\giroux.xnb 4> Building Textures\Body_Neutral.png -> <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\Textures\Body_Neutral.xnb 4> Importing Textures\Body_Neutral.png with Microsoft.Xna.Framework.Content.Pipeline.TextureImporter 4> Processing Textures\Body_Neutral.png with Microsoft.Xna.Framework.Content.Pipeline.Processors.TextureProcessor 4> Compiling <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\Textures\Body_Neutral.xnb 4> Building font.spritefont -> <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\font.xnb 4> Importing font.spritefont with Microsoft.Xna.Framework.Content.Pipeline.FontDescriptionImporter 4> Processing font.spritefont with Microsoft.Xna.Framework.Content.Pipeline.Processors.FontDescriptionProcessor 4> Compiling <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Content\font.xnb 4>ResolveAssemblyReferences: 4> A TargetFramework profile exclusion list will be generated. 
4>EmbedXnaFrameworkRuntimeProfile: 4>Skipping target "EmbedXnaFrameworkRuntimeProfile" because all output files are up-to-date with respect to the input files. 4>GenerateTargetFrameworkMonikerAttribute: 4>Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the input files. 4>CoreCompile: 4>Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files. 4>_CopyOutOfDateSourceItemsToOutputDirectoryAlways: 4> Copying file from "<solution-dir>\src\Engine.Core\DialoguePrototypeTestDB.s3db" to "bin\x86\Debug\DialoguePrototypeTestDB.s3db". 4>_CopyOutOfDateNestedContentItemsToOutputDirectory: 4>Skipping target "_CopyOutOfDateNestedContentItemsToOutputDirectory" because all output files are up-to-date with respect to the input files. 4>_CopyAppConfigFile: 4>Skipping target "_CopyAppConfigFile" because all output files are up-to-date with respect to the input files. 4>CopyFilesToOutputDirectory: 4> Engine.Game -> <solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Engine.Game.exe 4>IncrementalClean: 4> Deleting file "<solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\circle.xnb". 4> Deleting file "<solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\giroux.xnb". 4> Deleting file "<solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\Body_Neutral.xnb". 4> Deleting file "<solution-dir>\src\Engine.Game\Engine.Game\bin\x86\Debug\font.xnb". 4> 4>Build succeeded. 4> 4>Time Elapsed 00:00:01.72 ========== Build: 4 succeeded, 0 failed, 1 up-to-date, 0 skipped ========== I can't think of how build settings could change between consecutive executions. Like I said, this only happens for half our group. One member is on a 32-bit Windows 7 Prof bootcamp partition on a Mac. Everyone else, including those who don't have the issue, are running straight 64-bit Windows 7 Prof. Both have tried using VS 2010 and VS 2012. Any insight would be greatly appreciated. Also, I can post more details upon request if this isn't thorough enough.

    Read the article

  • Send files ending in .mp4 in Apache with HTTP 206 Partial Content

    - by Pacha
    I am using Apache as my web server and the return code is always HTTP/1.1 200. I want to set some kind of handler, or use a mod, to return HTTP/1.1 206 when the extension of the requested file is .mp4, so that video seeking works. My web server is already returning some headers for seeking, but it doesn't work. Is this possible? The HTTP headers:

      http://*hidden*/media/movies/file/1080/d3191cd83109c593ec908f3a47efa8a2.mp4

      GET /media/movies/file/1080/d3191cd83109c593ec908f3a47efa8a2.mp4 HTTP/1.1
      Host: *hidden*
      User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:31.0) Gecko/20100101 Firefox/31.0
      Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
      Accept-Language: en-US,en;q=0.5
      Accept-Encoding: gzip, deflate
      Referer: http://vjs.zencdn.net/4.6/video-js.swf
      Cookie: csrftoken=zXngwwS1S827g7aAJYbHJS3ajn5BGq9M; sessionid=uj1hlj00c85aoehw0n5fye8waggb7uod
      Connection: keep-alive

      HTTP/1.1 200 OK
      Date: Thu, 21 Aug 2014 15:04:46 GMT
      Server: Apache/2.2.22 (Debian)
      X-Mod-H264-Streaming: version=2.2.7
      Content-Length: 2148905782
      Last-Modified: Wed, 13 Aug 2014 11:36:46 GMT
      Etag: "8e002a-8015b345-5008133ff23c4;-2146061514"
      Accept-Ranges: bytes
      Keep-Alive: timeout=5, max=100
      Connection: Keep-Alive
      Content-Type: video/mp4
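    Two details in that exchange are worth noting before adding any handler: Apache only answers 206 when the client actually sends a Range header (the request above has none, so 200 is the expected answer there), and the X-Mod-H264-Streaming header shows mod_h264_streaming in the path, which handles seeking through its own ?start= style offsets and may be intercepting the mp4 requests. A quick way to check whether plain range requests work at all:

      # ask for the first 100 bytes only; curl prints the response headers (-D -) and discards the body
      curl -s -D - -o /dev/null -H "Range: bytes=0-99" \
          "http://yourserver/media/movies/file/1080/d3191cd83109c593ec908f3a47efa8a2.mp4"

      # a server that honors range requests replies "HTTP/1.1 206 Partial Content" plus a
      # Content-Range header; if it still says 200, something in front of the file is ignoring Range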

    Read the article

  • Toshiba eStudio 281c: turn off SNMP in the driver

    - by Matt
    I have a customer with an eStudio 281c colour copier/printer and I'm having an issue with terminal services. Basically, the driver has the ability to query the printer for its configuration over snmp. I've set it to manual rather than auto so it's effectively switched off. Trouble is, when logging into terminal services it's still trying to communicate with the printer. This is an issue because when it fails it then defaults to being a different printer model with a different tray configuration. Users can still print but the settings are wrong. Any ideas?

    Read the article

  • mount another drive to the same directory

    - by Ken Autotron
    I recently purchased a server that was advertised as 2TB (2 1TB drives) in size, when I use it it reports only one of the drives, I would like to be able to use both as if one drive. here is the specs... sudo lshw -C disk *-disk description: ATA Disk product: TOSHIBA DT01ACA1 vendor: Toshiba physical id: 0.0.0 bus info: scsi@1:0.0.0 logical name: /dev/sda version: MS2O serial: 13EJ81XPS size: 931GiB (1TB) capabilities: partitioned partitioned:dos configuration: ansiversion=5 signature=0005b3dd *-disk description: ATA Disk product: TOSHIBA DT01ACA1 vendor: Toshiba physical id: 0.0.0 bus info: scsi@4:0.0.0 logical name: /dev/sdb version: MS2O serial: 13OX3TKPS size: 931GiB (1TB) capabilities: partitioned partitioned:dos configuration: ansiversion=5 signature=00030e86 and fdisk -l Disk /dev/sdb: 1000.2 GB, 1000204886016 bytes 255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 4096 bytes I/O size (minimum/optimal): 4096 bytes / 4096 bytes Disk identifier: 0x00030e86 Device Boot Start End Blocks Id System /dev/sdb1 * 4096 41947135 20971520 fd Linux raid autodetect /dev/sdb2 41947136 1952468991 955260928 fd Linux raid autodetect /dev/sdb3 1952468992 1953519615 525312 82 Linux swap / Solaris Disk /dev/sda: 1000.2 GB, 1000204886016 bytes 255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 4096 bytes I/O size (minimum/optimal): 4096 bytes / 4096 bytes Disk identifier: 0x0005b3dd Device Boot Start End Blocks Id System /dev/sda1 * 4096 41947135 20971520 fd Linux raid autodetect /dev/sda2 41947136 1952468991 955260928 fd Linux raid autodetect /dev/sda3 1952468992 1953519615 525312 82 Linux swap / Solaris Disk /dev/md2: 978.2 GB, 978187124736 bytes 2 heads, 4 sectors/track, 238815216 cylinders, total 1910521728 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 4096 bytes I/O size (minimum/optimal): 4096 bytes / 4096 bytes Disk identifier: 0x00000000 Disk /dev/md2 doesn't contain a valid partition table Disk /dev/md1: 21.5 GB, 21474770944 bytes 2 heads, 4 sectors/track, 5242864 cylinders, total 41942912 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 4096 bytes I/O size (minimum/optimal): 4096 bytes / 4096 bytes Disk identifier: 0x00000000 Disk /dev/md1 doesn't contain a valid partition table is it possible to mount both drives to say /Home/ so I would have 2TB of usable space?
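    One thing jumps out of that output before any remounting: the partitions are typed "Linux raid autodetect" and there are /dev/md1 and /dev/md2 devices, which normally means the hoster preinstalled the two disks as a software RAID1 mirror, so roughly 1 TB of usable space is exactly what you would expect. Worth confirming before deciding whether to re-provision as RAID0 or LVM (which trades away the redundancy):

      # show how the md arrays are assembled (raid1 = mirror, raid0 = striping)
      cat /proc/mdstat
      sudo mdadm --detail /dev/md2

      # and what is actually mounted where
      df -h

    If it really is RAID1, both disks are already backing the same filesystem; getting 2 TB of plain space means rebuilding the array, not just adding another mount.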

    Read the article

  • Connecting the Dots (.NET Business Connector)

    - by ssmantha
    Recently, one of my colleagues was experimenting with Reporting Server on DAX 2009. Whenever he tried to view a report in SQL Server Reporting Manager, he was welcomed with an error: “Error during processing Ax_CompanyName report parameter. (rsReportParameterProcessingError)” The Event Log had the following entry:

    Dynamics Adapter LogonAs failed. Microsoft.Dynamics.Framework.BusinessConnector.Session.Exceptions.FatalSessionException at Microsoft.Dynamics.Framework.BusinessConnector.Session.DynamicsSession.HandleException(String message, Exception exception, HandleExceptionCallback callback)

    We later found out that this was due to an incorrect Business Connector account; in my experience this is a very common mistake made during EP and Reporting installations. Remember that the reports need to connect to the Dynamics AX server to run the AxQueries, and that connection passes through the .NET Business Connector. To ensure everything works, check the following settings:

    1) Your Report Server service account should be the same as the .NET Business Connector proxy account.
    2) On the server where Reporting Services is installed, make sure the Business Connector client configuration utility points to the correct proxy account.
    3) Finally, make sure the AX instance you are connecting to has a service account specified for the .NET Business Connector (Administration –> Service accounts –> .NET Business Connector).

    These simple checkpoints resolve most Business Connector related errors, which in my experience are mostly due to incorrect configuration settings. Happy DAXing!!

    Read the article

  • How can I get the Creative Zen Touch to work?

    - by Hello71
    I've tried using gnomad2 (too lazy to configure all the dependencies) and Amarok (segfaulted when I tried to do anything with it).

    $ lsusb
    Bus 002 Device 004: ID 041e:4131 Creative Technology, Ltd Zen Touch (mtp)
    # SNIP OUTPUT #

    $ lsusb --verbose -s 002:004
    Bus 002 Device 004: ID 041e:4131 Creative Technology, Ltd Zen Touch (mtp)
    Device Descriptor:
      bLength                18
      bDescriptorType         1
      bcdUSB               2.00
      bDeviceClass          255 Vendor Specific Class
      bDeviceSubClass         0
      bDeviceProtocol         0
      bMaxPacketSize0        64
      idVendor           0x041e Creative Technology, Ltd
      idProduct          0x4131 Zen Touch (mtp)
      bcdDevice            1.00
      iManufacturer           1 Creative Technology Ltd
      iProduct                2 Creative Zen Touch
      iSerial                 3 010125517D039098
      bNumConfigurations      1
      Configuration Descriptor:
        bLength                 9
        bDescriptorType         2
        wTotalLength           39
        bNumInterfaces          1
        bConfigurationValue     1
        iConfiguration         16 Configuration 1
        bmAttributes         0xc0
          Self Powered
        MaxPower              440mA
        Interface Descriptor:
          bLength                 9
          bDescriptorType         4
          bInterfaceNumber        0
          bAlternateSetting       0
          bNumEndpoints           3
          bInterfaceClass         0 (Defined at Interface level)
          bInterfaceSubClass      0
          bInterfaceProtocol      0
          iInterface             33 MTP Interface
          Endpoint Descriptor:
            bLength                 7
            bDescriptorType         5
            bEndpointAddress     0x01  EP 1 OUT
            bmAttributes            2
              Transfer Type            Bulk
              Synch Type               None
              Usage Type               Data
            wMaxPacketSize     0x0200  1x 512 bytes
            bInterval               0
          Endpoint Descriptor:
            bLength                 7
            bDescriptorType         5
            bEndpointAddress     0x82  EP 2 IN
            bmAttributes            2
              Transfer Type            Bulk
              Synch Type               None
              Usage Type               Data
            wMaxPacketSize     0x0200  1x 512 bytes
            bInterval               0
          Endpoint Descriptor:
            bLength                 7
            bDescriptorType         5
            bEndpointAddress     0x83  EP 3 IN
            bmAttributes            3
              Transfer Type            Interrupt
              Synch Type               None
              Usage Type               Data
            wMaxPacketSize     0x0008  1x 8 bytes
            bInterval               4
    Device Qualifier (for other device speed):
      bLength                10
      bDescriptorType         6
      bcdUSB               2.00
      bDeviceClass          255 Vendor Specific Class
      bDeviceSubClass         0
      bDeviceProtocol         0
      bMaxPacketSize0        64
      bNumConfigurations      1
    can't get debug descriptor: Connection timed out
    Device Status:     0x0000
      (Bus Powered)
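    Since the player enumerates as an MTP device (ID 041e:4131), libmtp's command-line tools are a lightweight way to check whether the MTP side works at all before fighting with a full GUI. A rough sketch, assuming the mtp-tools and gmtp packages are available in your release's repositories:

    sudo apt-get install mtp-tools gmtp   # libmtp CLI utilities plus a small GTK client
    mtp-detect                            # should identify the Zen Touch and print its storage info
    mtp-files                             # list the tracks/files currently on the player

    If mtp-detect sees the device, gmtp (or Rhythmbox's MTP support) should be able to transfer music to it; if it doesn't, the problem is below the application layer and the mtp-detect output is the useful thing to post.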

    Read the article

  • Location of truetype fonts

    - by StackedCrooked
    I would like to create a small script that installs a few TrueType fonts on the user's system. On my Ubuntu machine the TrueType fonts are located at /usr/share/fonts/truetype. However, I'm not sure if this location is the same on all machines. Is there a way to find out where TrueType fonts are stored on any Linux system?

    Update: After some research I found that the path /usr/share/fonts/truetype is specified in /etc/fonts/fonts.conf. It's an XML file, so I can use XPath to get the directory:

    xpath -q -e 'fontconfig/dir[1]/text()[1]' /etc/fonts/fonts.conf

    However, I don't know whether this file will exist on all (or most) Linux systems.
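    For a script, it is usually safer to lean on fontconfig itself than to hard-code a system directory: installing into the per-user font directory and rebuilding the cache works on any fontconfig-based distribution. A minimal sketch (MyFont.ttf is a placeholder name; ~/.fonts is the classic per-user location, and newer systems also accept ~/.local/share/fonts):

    mkdir -p ~/.fonts                 # per-user font directory picked up by fontconfig
    cp MyFont.ttf ~/.fonts/           # drop the TrueType files in
    fc-cache -f -v ~/.fonts           # rebuild the font cache
    fc-list | grep -i myfont          # confirm fontconfig now sees the new font

    This sidesteps the question of where the system-wide truetype directory lives, and it doesn't need root.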

    Read the article

  • Cannot uninstall myspell-he

    - by Yuval Rabinovich
    I tried to install a Hebrew spell checker for LibreOffice. I downloaded a package named myspell-he_1.1-1_all.deb and tried to install it through the Software Center; installation failed and I can neither complete the installation nor remove it. Since then, whenever I run Update Manager I get the error message "The package system is broken", with the details: "The following packages have unmet dependencies: dictionaries-common". When I try to run the Software Center I get a pop-up window: "Items cannot be installed or removed until the package catalog is repaired." A Repair option is suggested. If I click it, the following message appears:

    installArchives() failed: dpkg: dependency problems prevent configuration of myspell-he:
     dictionaries-common (1.12.1ubuntu2) breaks myspell-he (<= 1.1-1) and is installed.
     Version of myspell-he to be configured is 1.1-1.
    dpkg: error processing myspell-he (--configure):
     dependency problems - leaving unconfigured
    No apport report written because MaxReports is reached already
    Errors were encountered while processing:
     myspell-he
    Error in function:
    dpkg: dependency problems prevent configuration of myspell-he:
     dictionaries-common (1.12.1ubuntu2) breaks myspell-he (<= 1.1-1) and is installed.
     Version of myspell-he to be configured is 1.1-1.
    dpkg: error processing myspell-he (--configure):
     dependency problems - leaving unconfigured

    How can I clear this mess?
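    The error itself says what is wrong: the dictionaries-common shipped with your release declares that it breaks myspell-he 1.1-1, so the hand-downloaded .deb can never be configured. One way out, sketched below, is to force the half-installed package off the system, let apt repair itself, and then install a Hebrew dictionary from the official repositories instead (hunspell-he is an assumed package name; use whatever the search actually shows):

    sudo dpkg --remove --force-depends myspell-he   # drop the broken, half-configured package
    sudo apt-get -f install                         # let apt fix up dictionaries-common and friends
    apt-cache search hebrew spell                   # look for the packaged Hebrew dictionary
    sudo apt-get install hunspell-he                # assumed name -- substitute the search result

    After that, restart LibreOffice and enable the Hebrew dictionary under Tools -> Options -> Language Settings -> Writing Aids.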

    Read the article

  • Use 3 monitors w/built-in intel adapter + two old nvidia PCI cards on 10.10?

    - by Kendall Gifford
    I'd like to move from Windows with my current workstation. The only thing holding me back is that I have 3 monitors connected to the system and I really take advantage of the real estate when working. I just installed Ubuntu 10.10 on the system and one of the monitors is up and running just fine. This monitor is connected to the built-in Intel adapter. I also have two old nVidia GeForce4 MX 4000 (nv19pl) cards in my two PCI slots, with one monitor connected to each. I installed the legacy (and proprietary) nVidia driver (the nvidia-96 package) that claims to support these old cards. Now the question is how to get X configured to use all adapters (using two different drivers) so I can use all three monitors (and is this even possible)? From what I've read, it looks like I'll have to write an xorg.conf file since the nVidia driver doesn't support the auto-magic configuration supported by other drivers. On this site: http://wiki.ubuntu.com/X/Config it says that on 10.10 I just need to write an xorg.conf "containing only those sections and options that you need to override Xorg's autoconfigurated settings". So, does this mean I can get away with only including the nVidia-specific configuration and everything else will get auto-configured? Or will providing a config with a "Device" section stop the auto-magic from detecting/using the Intel adapter? I ran the included nvidia-xconfig to generate a basic, nVidia-specific xorg.conf, but I'm hesitant to reboot with it in place, suspecting I'll end up with a screwed-up display. Also, is there any way (any tool or command) to generate an xorg.conf from the current, auto-configured running state of an X session? If I have to write a full, complete config, I'd rather start with one that includes everything that's been auto-detected thus far (and merge it with my nVidia version). Anyhow, any info and thoughts are greatly appreciated (as are answers).
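    On the last question (generating a config from what X has already detected): the X server can write one for you from a text console, and that file is a reasonable base to merge the nvidia-xconfig output into. A sketch, assuming 10.10's GDM; the path of the generated file varies, so watch what the command prints:

    sudo service gdm stop              # X must not be running on the display being configured
    sudo Xorg -configure               # probes the hardware and writes an xorg.conf.new, printing its location
    lspci | grep -i vga                # note the BusIDs of the Intel adapter and the two nVidia cards
    sudo cp /root/xorg.conf.new /etc/X11/xorg.conf   # adjust the path to wherever it was written
    sudo service gdm start             # back to the desktop to test the merged config

    From there, the usual approach is one Device/Monitor/Screen trio per card (Driver "intel" for the on-board adapter, Driver "nvidia" for the PCI cards, each with its BusID from lspci), all tied together in a single ServerLayout; whether the legacy nvidia-96 blob will actually cohabit with the Intel driver in one server is exactly the part you would be testing.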

    Read the article
