Search Results

Search found 391 results on 16 pages for 'configs'.

Page 8/16 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Ubuntu 12.04LTS mountall: Disconnected from Plymouth

    - by user169954
    I have Ubuntu 12.04 LTS 64-bit running on an i5 dual core with 8 GB RAM. On startup I get the message "mountall: Disconnected from Plymouth [OK]" and the system looks stuck. However, if I go to tty1, I can log in and run startx, and everything seems to be fine apart from being a bit sluggish. I can verify that my NFS mounts are OK, and that my swap is OK. Every time I reboot the system there is a _gdm_gdm_crash file in my /var/crash, which makes me think my problem is rooted in gdm, X configs and/or the NVIDIA drivers. A bit of background in case it's relevant: 3 hours ago my desktop crashed. Following various 'tips' on the web I made a complete mess of my X server and X configuration files, and at one point I even had to recreate my swap partition. Anyway, after much struggle I managed to get to the state I mentioned above: I have a working system provided I always log in through tty1. What is this Plymouth anyway? Would it make a difference if I used gnome-wm instead of gdm or lightdm? (I mean to the startup, not to me :-) What bit of config do I change to tell startx to use gnome-wm and not gdm or lightdm? Thank you in advance
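    For reference, a minimal sketch of the piece of config startx actually reads: when X is started manually from a console login it runs ~/.xinitrc rather than a display manager, so the session to launch is chosen there. The session command below is an assumption; substitute whichever desktop launcher you actually want.

      # ~/.xinitrc (read by startx when run from a tty login)
      # "gnome-session" is a placeholder for whichever session or window manager you prefer
      exec gnome-session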

    Read the article

  • Module configuration and layout configuration in Zend Framework

    - by Prasanth P
    Hi all, I got some codes from other articles for configuring module and layout in zend framework. I tried with in my local. i didn't get different layout for default and admin module. Here is my code for configuring module and layout for zend framework. configs/application.ini [production] # Debug output phpSettings.display_startup_errors = 0 phpSettings.display_errors = 0 # Include path includePaths.library = APPLICATION_PATH "/../library" # Bootstrap bootstrap.path = APPLICATION_PATH "/Bootstrap.php" bootstrap.class = "Bootstrap" admin.bootstrap.path = APPLICATION_PATH "/modules/admin/Bootstrap.php" admin.bootstrap.class = "admin_Bootstrap" # Front Controller resources.frontController.controllerDirectory = APPLICATION_PATH "/controllers" resources.frontController.env = APPLICATION_ENV # Session resources.session.name = "ZendSession" resources.session.save_path = APPLICATION_PATH "/../data/session" resources.session.remember_me_seconds = 86400 # Layout resources.layout.layout = "layout" resources.layout.layoutPath = APPLICATION_PATH "/layouts" admin.resources.layout.layout = "admin" admin.resources.layout.layoutPath = APPLICATION_PATH "/modules/admin/layouts" # Views resources.view.encoding = "UTF-8" resources.view.basePath = APPLICATION_PATH "/views/" resources.view[] = resources.frontController.moduleDirectory = APPLICATION_PATH "/modules" resources.modules[] = resources.view[] = admin.resources.view[] = [staging : production] [testing : production] phpSettings.display_startup_errors = 1 phpSettings.display_errors = 1 [development : production] phpSettings.display_startup_errors = 1 phpSettings.display_errors = 1 application/Bootstrap.php <?php /** * Ensure all communications are managed by sessions. */ require_once ('Zend/Session.php'); Zend_Session::start(); class Bootstrap extends Zend_Application_Bootstrap_Bootstrap { protected function _initDoctype() { $this->bootstrap( 'view' ); $view = $this->getResource( 'view' ); $view->navigation = array(); $view->subnavigation = array(); $view->headTitle( 'Module One' ); $view->headLink()->appendStylesheet('/css/clear.css'); $view->headLink()->appendStylesheet('/css/main.css'); $view->headScript()->appendFile('/js/jquery.js'); $view->doctype( 'XHTML1_STRICT' ); //$view->navigation = $this->buildMenu(); } /*protected function _initAppAutoLoad() { $autoloader = new Zend_Application_Module_Autoloader(array( 'namespace' => 'default', 'basePath' => APPLICATION_PATH )); return $autoloader; }*/ protected function _initLayoutHelper() { $this->bootstrap('frontController'); $layout = Zend_Controller_Action_HelperBroker::addHelper( new ModuleLayoutLoader()); } public function _initControllers() { $front = Zend_Controller_Front::getInstance(); $front->addModuleDirectory(APPLICATION_PATH . '/modules/admin/', 'admin'); } protected function _initAutoLoadModuleAdmin() { $autoloader = new Zend_Application_module_Autoloader(array( 'namespace' => 'Admin', 'basePath' => APPLICATION_PATH . 
'/modules/admin' )); return $autoloader; } protected function _initModuleutoload() { $autoloader = new Zend_Application_Module_Autoloader ( array ('namespace' => '', 'basePath' => APPLICATION_PATH ) ); return $autoloader; } } class ModuleLayoutLoader extends Zend_Controller_Action_Helper_Abstract // looks up layout by module in application.ini { public function preDispatch() { $bootstrap = $this->getActionController() ->getInvokeArg('bootstrap'); $config = $bootstrap->getOptions(); echo $module = $this->getRequest()->getModuleName(); /*echo "Configs : <pre>"; print_r($config[$module]);*/ if (isset($config[$module]['resources']['layout']['layout'])) { $layoutScript = $config[$module]['resources']['layout']['layout']; $this->getActionController() ->getHelper('layout') ->setLayout($layoutScript); } } } application/modules/admin/Bootstrap.php <?php class Admin_Bootstrap extends Zend_Application_Module_Bootstrap { /*protected function _initAppAutoload() { $autoloader = new Zend_Application_Module_Autoloader(array( 'namespace' => 'admin', 'basePath' => APPLICATION_PATH . '/modules/admin/' )); return $autoloader; }*/ protected function _initDoctype() { $this->bootstrap( 'view' ); $view = $this->getResource( 'view' ); $view->navigation = array(); $view->subnavigation = array(); $view->headTitle( 'Module One' ); $view->headLink()->appendStylesheet('/css/clear.css'); $view->headLink()->appendStylesheet('/css/main.css'); $view->headScript()->appendFile('/js/jquery.js'); $view->doctype( 'XHTML1_STRICT' ); //$view->navigation = $this->buildMenu(); } } Please go through it and let me know any knows how do configure module and layout in right way.. Thanks and regards, Prasanth P

    Read the article

  • How to push changes from Test server to Live server?

    - by anonymous
    As a beginner, I finally noticed the problem with making changes directly to the live server I've been working on, now that I have a couple of users on it, since I bring it down so often. I created an EC2 image of my live server and set up a separate instance on EC2, so now I have two EC2 instances, Stage and Production. I set up GitHub and push changes to Stage and test my code there, and when it's all done and working, I push it to the production branch, and everything is good. There is a slight wrinkle here: I name my files config_stage.js and config_production.js, set up .gitignore on each server, and in my code I read the ENV flags and load the appropriate configs. Is this the correct approach? And my main question is: how do you keep track of non-code changes to the server? For example, I installed HAProxy, Stunnel, Redis, MongoDB and several other things onto the Stage server for testing, and now that it's all working, how do I deploy them to Production? Right now, I'm just keeping track of everything I installed and copying configuration files over, which is very tedious, and I'm afraid I may have missed a step somewhere. Is there a better way to port these changes over from my test server to my live server?
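    For the package side of this, a hedged sketch (hostnames and paths are placeholders): capture the Stage box's package selections and replay them on Production, then handle the /etc configs separately, either by copying them or by putting them under a configuration-management tool.

      # On Stage: record every installed package
      ssh stage 'dpkg --get-selections' > stage-packages.txt
      # On Production: replay the list and install whatever is missing
      ssh production 'sudo dpkg --set-selections' < stage-packages.txt
      ssh production 'sudo apt-get -y dselect-upgrade'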

    Read the article

  • API Auth vs User Auth

    - by user1626384
    I have read many posts and articles on this topic but still can't connect the dots. I want to make a Rails app that is strictly a JSON API, maybe using Sinatra or the rails-api gem. I also want to make both a web client app and an iPhone app that consume the API. There are no plans to let third-party devs use it. So I could create a separate username/password combination for both the web and mobile clients and use HTTP Basic over SSL. Each app would have these values as configs in its source and use them to authenticate to the API, so only these apps can make a call; anyone else trying would get a 401 error back. This would be considered handling the API authentication. The web and mobile client apps allow end users to sign up and read/write data through the API. When each user is created, I create and save a token in their profile. If a user successfully signs in, I send back the token. On each future read/write the client also sends along this token in the header; I take the token, look up the user in the database, and perform the read/write. Does this sound like an appropriate way to handle it? For the web client, when I initially send back the token, where do I store it? In a cookie? Do I also drop a cookie to handle session state?
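    For illustration, a hedged sketch of what a client call might look like under this scheme. The header name, credentials and URL are placeholders (the question doesn't fix them): the app-level username/password travels as HTTP Basic over SSL, and the per-user token rides in a custom header.

      # webclient:secret = the per-app credentials baked into the client's config
      # X-Auth-Token     = hypothetical header carrying the per-user token
      curl -u webclient:secret \
           -H 'X-Auth-Token: 1f3a9c...' \
           -H 'Accept: application/json' \
           https://api.example.com/v1/items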

    Read the article

  • Fixing up Configurations in BizTalk Solution Files

    - by Elton Stoneman
    Just a quick one this, but useful for mature BizTalk solutions, where over time the configuration settings can get confused, meaning Debug configurations building in Release mode, or Deployment configurations building in Development mode. That can cause issues in the build which aren't obvious, so it's good to fix up the configurations. It's time-consuming in VS or in a text editor, so this bit of PowerShell may come in useful - just substitute your own solution path in the $path variable:

      $path = 'C:\x\y\z\x.y.z.Integration.sln'
      $backupPath = [System.String]::Format('{0}.bak', $path)
      [System.IO.File]::Copy($path, $backupPath, $True)
      $sln = [System.IO.File]::ReadAllText($path)

      $sln = $sln.Replace('.Debug|.NET.Build.0 = Deployment|.NET', '.Debug|.NET.Build.0 = Development|.NET')
      $sln = $sln.Replace('.Debug|.NET.Deploy.0 = Deployment|.NET', '.Debug|.NET.Deploy.0 = Development|.NET')
      $sln = $sln.Replace('.Debug|Any CPU.ActiveCfg = Deployment|.NET', '.Debug|Any CPU.ActiveCfg = Development|.NET')
      $sln = $sln.Replace('.Deployment|.NET.ActiveCfg = Debug|Any CPU', '.Deployment|.NET.ActiveCfg = Release|Any CPU')
      $sln = $sln.Replace('.Deployment|Any CPU.ActiveCfg = Debug|Any CPU', '.Deployment|Any CPU.ActiveCfg = Release|Any CPU')
      $sln = $sln.Replace('.Deployment|Any CPU.Build.0 = Debug|Any CPU', '.Deployment|Any CPU.Build.0 = Release|Any CPU')
      $sln = $sln.Replace('.Deployment|Mixed Platforms.ActiveCfg = Debug|Any CPU', '.Deployment|Mixed Platforms.ActiveCfg = Release|Any CPU')
      $sln = $sln.Replace('.Deployment|Mixed Platforms.Build.0 = Debug|Any CPU', '.Deployment|Mixed Platforms.Build.0 = Release|Any CPU')
      $sln = $sln.Replace('.Deployment|.NET.ActiveCfg = Debug|Any CPU', '.Deployment|.NET.ActiveCfg = Release|Any CPU')
      $sln = $sln.Replace('.Debug|.NET.ActiveCfg = Deployment|.NET', '.Debug|.NET.ActiveCfg = Development|.NET')

      [System.IO.File]::WriteAllText($path, $sln)

    The script creates a backup of the solution file first, and then fixes up all the configs to use the correct builds. It's a simple search-and-replace list, so if there are any patterns that need to be added let me know and I'll update the script. A RegEx replace would be neater, but when it comes to hacking solution files, I prefer the conservative approach of knowing exactly what you're changing.

    Read the article

  • vsftpd: chroot_local_user causes GnuTLS error

    - by akrosikam
    Distro: Ubuntu 12.04.2 Server 32-bit
    FTP server: vsftpd 2.3.5 (from the default "main" repository)

    Problem: Since upgrading from Ubuntu 10.04 to Ubuntu 12.04 (nothing changed on the client side), vsftpd has refused to make chroot jails with the "chroot_local_user" directive on FTP(e/i)S connections. Here's my vsftpd.conf:

      anonymous_enable=NO
      local_enable=YES
      write_enable=YES
      local_umask=022
      dirmessage_enable=YES
      xferlog_enable=YES
      xferlog_std_format=YES
      ftpd_banner=How are you gentlemen.
      listen=YES
      pam_service_name=vsftpd
      userlist_enable=YES
      userlist_deny=NO
      tcp_wrappers=YES
      connect_from_port_20=YES
      ftp_data_port=20
      listen_port=21
      pasv_enable=YES
      pasv_promiscuous=NO
      pasv_min_port=4242
      pasv_max_port=4252
      pasv_addr_resolve=YES
      pasv_address=your.domain.com
      ssl_enable=YES
      allow_anon_ssl=NO
      force_local_logins_ssl=YES
      force_local_data_ssl=YES
      ssl_tlsv1=YES
      ssl_sslv2=NO
      ssl_sslv3=NO
      rsa_cert_file=/home/maw/ssl_ftp_test/vsftpd.pem
      rsa_private_key_file=/home/maw/ssl_ftp_test/vsftpd.pem
      debug_ssl=YES
      log_ftp_protocol=YES
      ssl_ciphers=HIGH
      chroot_local_user=NO

    How to reproduce:
    1. Have a working SSL/TLS-secured vsftpd configuration (I suggest something similar to the one above) ready. Try to connect with an FTP client and upload some files. With my setup, the config listed above works well at this point.
    2. Edit /etc/vsftpd.conf and set chroot_local_user= to YES. Make sure that chroot_list_enable= and/or chroot_list_file= are not set; comment them out if they are. Save and exit.
    3. Run sudo restart vsftpd (or sudo service vsftpd restart if you like) in a terminal.
    4. Try to connect with an FTP client. You should see a message more or less like this: GnuTLS error -15: An unexpected TLS packet was received.

    This is an issue for me, as I do not want FTP sessions to be able to list files outside the user's home folder. I have checked with several client-side apps, and I get the same result with every one of them. FileZilla is not so good regarding cipher methods nowadays, but as I am able to make an FTP(e)S connection over TLS (as long as chroot'ing is disabled and ssl_ciphers is set to HIGH), I have a feeling ciphers are not the issue this time, and that I won't find the answer by tweaking configs on the client side. My vsftpd.log stays empty, even though debug_ssl and log_ftp_protocol are enabled, so there is no info there either.

    Read the article

  • How to Run Pam Face Authentication

    - by Supriyo Banerjee
    I am using Ubuntu 11.10. I went to the following URL to download the software 'Pam Face Authentication': http://ppa.launchpad.net/antonio.chiurazzi/ppa/ubuntu/pool/main/p/pam-face-authentication/ and downloaded the version for Natty Narwhal. I installed the software using the following commands:

      sudo apt-get install build-essential cmake qt4-qmake libx11-dev libcv-dev libcvaux-dev libhighgui2.1 libhighgui-dev libqt4-dev libpam0g-dev checkinstall
      cd /tmp && wget http://pam-face-authentication.googlecode.com/files/pam-face-authentication-0.3.tar.gz
      sudo add-apt-repository ppa:antonio.chiurazzi
      sudo apt-get update
      sudo apt-get install pam-face-authentication
      cat << EOF | sudo tee /usr/share/pam-configs/face_authentication /dev/null
      Name: face_authentication profile
      Default: yes
      Priority: 900
      Auth-Type: Primary
      Auth: [success=end default=ignore] pam_face_authentication.so enableX
      EOF
      sudo pam-auth-update --package face_authentication

    The software installed and I can run qt-facetrainer. But the problem is that when I restart my system, the default login screen appears asking for my password. The webcam does not start at all, and I cannot log in with my face, which makes me think the pam-face-authentication programme is not starting at all. Please let me know how I can log in with my face using the pam-face-authentication programme.

    Read the article

  • How do I enable sound with the "linux-virtual" kernel?

    - by Ola Tuvesson
    I've been trying to enable sound for the linux-virtual kernel as I want to run an ultra slim Ubuntu server under VirtualBox but need audio. The resource usage difference between virtual and generic/server is surprisingly large, with the virtual kernel system using 80Mb less RAM after a clean boot (130Mb vs 210Mb), and I really want to squeeze every clock cycle and available byte I can out of the system. Besides, the virtual kernel has some additional optimisations enabled specifically for virtual machines (or so I am told). Now I have compiled my own kernel a few times in the past, for example to include the Intel-PHC module (for improved power management on Thinkpads), so the concept is not entirely alien to me, but I've run into a strange problem which I'm hoping someone can help explain: When I do a diff between the config files for Linux-generic and Linux-virtual there are precious few differences, and certainly none which pertain to sound support; there are really only five or six lines which differ, and they're mainly to do with i/o timing, sleep state and priorities. What gives? I expected the differences to be extensive, and that I would be able to identify the options that enabled audio by looking at them, but my problem doesn't seem to be related to the config file at all (yes, I know about the sound drivers section - it is identical between the two kernel configs). Am I looking in the wrong place? Many thanks!
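    As a first, non-authoritative check (assuming the alsa-utils package is available in the guest), it may be worth confirming whether the -virtual kernel can see and load any sound driver at all before rebuilding anything:

      uname -r                      # confirm the -virtual kernel flavour is the one running
      lsmod | grep '^snd'           # any ALSA modules currently loaded?
      sudo modprobe snd-intel8x0    # VirtualBox's AC'97 device (try snd-hda-intel for the HDA device)
      aplay -l                      # list the playback devices ALSA can now see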

    Read the article

  • Gnome-Network-Manager Config File Migration

    - by Jorge
    I think I have an issue with gnome-network-manager. I used to have a lot of connections configured: wireless, wired and VPN. After upgrading to 12.04 (from 11.10) I lost every configuration. I realized that the configs that used to be saved in $HOME/.gconf/system/networking/connections are now being saved in /etc/NetworkManager/system-connections/. I don't know how to migrate my settings to the new config file format. Can anybody help me?

      jorge@thinky:~$ sudo lshw -C network
        *-network
             description: Ethernet interface
             product: 82566MM Gigabit Network Connection
             vendor: Intel Corporation
             physical id: 19
             bus info: pci@0000:00:19.0
             logical name: eth0
             version: 03
             serial: 00:1f:e2:14:5a:9b
             capacity: 1Gbit/s
             width: 32 bits
             clock: 33MHz
             capabilities: pm msi bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt-fd autonegotiation
             configuration: autonegotiation=on broadcast=yes driver=e1000e driverversion=1.5.1-k firmware=0.3-0 latency=0 link=no multicast=yes port=twisted pair
             resources: irq:46 memory:fe000000-fe01ffff memory:fe025000-fe025fff ioport:1840(size=32)
        *-network
             description: Wireless interface
             product: PRO/Wireless 4965 AG or AGN [Kedron] Network Connection
             vendor: Intel Corporation
             physical id: 0
             bus info: pci@0000:03:00.0
             logical name: wlan0
             version: 61
             serial: 00:21:5c:32:c2:e5
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless
             configuration: broadcast=yes driver=iwl4965 driverversion=3.2.0-23-generic-pae firmware=228.61.2.24 ip=192.168.2.103 latency=0 link=yes multicast=yes wireless=IEEE 802.11abgn
             resources: irq:47 memory:df3fe000-df3fffff
      jorge@thinky:~$ lsb_release -a
      No LSB modules are available.
      Distributor ID: Ubuntu
      Description:    Ubuntu 12.04 LTS
      Release:        12.04
      Codename:       precise
      jorge@thinky:~$ uname -a
      Linux thinky 3.2.0-23-generic-pae #36-Ubuntu SMP Tue Apr 10 22:19:09 UTC 2012 i686 i686 i386 GNU/Linux
      jorge@thinky:~$ dpkg -l | grep -i firm
      ii  linux-firmware  1.79  Firmware for Linux kernel drivers

    Read the article

  • Presario r3000 hangs on boot to Ubuntu 12.04

    - by Chris
    I have a Compaq Presario r3000 laptop with Ubuntu 12.04 32-bit, 1.2 GB RAM and an NVIDIA GeForce 420 Go with 32 MB of video RAM. It hangs on bootup about 3 out of 4 times, at either the purple screen or a cursor in the upper left corner. Booting without a splash screen shows it hanging after attempting to initialize the floppy controller. Unity has never booted into 3D either, only 2D, with the additional drivers always failing to install. I searched all over this board, found similar issues, and tried everything there: extensive video driver installs and uninstalls, and network issues that I've since resolved. I even found a very similar topic with a fix that didn't work. I'll probably try a clean install today (given how much I've probably messed up), and I would really appreciate any ideas. I'm fairly new to Ubuntu, so if you would like to see any error logs or configs, please walk me through exactly how to get them and post them here. Thanks.
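    As a starting point for gathering those logs, these are the stock locations on Ubuntu 12.04; run the commands from a terminal (or a recovery console) after a hung boot and paste the output:

      dmesg | tail -n 100                          # kernel messages from the current boot
      grep -E '\(EE\)|\(WW\)' /var/log/Xorg.0.log  # X server errors and warnings
      cat /var/log/boot.log                        # boot-time messages, if boot logging is enabled
      lspci | grep -i vga                          # confirm which video chip is detected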

    Read the article

  • What resources are there for creating a dedicated NES emulator box?

    - by normalocity
    Where do I start, and what communities should I get involved in, in order to achieve the following? Ideally, I'd like to have a box that does the following (doesn't have to do this out of the box, I'm just looking to be able to achieve these goals through configs and necessary dependencies): Either bypasses login, or auto login Auto-start FCEUX with options that will (a) automatically start a ROM of my choosing, and (b) go into full-screen mode. You can assume that before I get that far, I've already configured the input devices and video options. I'd like to create (or install, if it exists) a full-screen app that takes a list of ROMs, allows me to select one with a gamepad/arcade stick, and press a button to open that game Be able to map a button on a gamepad/arcade stick to the "Power off" or exit function of the emulator, such that it will take me back to the ROM selection screen. I've already successfully installed FCEUX and tested it with an arcade stick I own, so I'm not looking for an emulator installer guide. I don't know if the ROM selector app exists already, but I'm a Java developer, and could probably create one (so long as it's not too difficult to support controllers - I was thinking of using Slick2D for this - a gaming library that I'm already pretty familiar with). The goal would be a dedicated box that I have connected to my TV. I power it on. It boots up and starts the ROM selection app, which passes the proper parameters to FCEUX (or another emulator that I might switch to at a later time), and I'm ready to go. Basically an NES emulator as a real, living room console. Also, as far as mapping a controller button to functions in the app, well, I've also played around with hardware, and it would be pretty trivial for me to modify a gamepad to trigger key presses. I just don't want to go to that length if it's not necessary.
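    For the auto-start piece, one hedged sketch, assuming an auto-login user whose ~/.xinitrc launches the emulator: the exact FCEUX flags differ between the SDL and GTK builds, so treat the options below as illustrative and check fceux --help on your install.

      # ~/.xinitrc for the dedicated "arcade" user (auto-logged-in on tty1)
      # --fullscreen and the ROM path are assumptions; the ROM-selector app described
      # above would eventually replace this line and exec fceux itself.
      exec fceux --fullscreen 1 /home/arcade/roms/default.nes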

    Read the article

  • create a .deb Package from scripts or binaries

    - by tdeutsch
    I searched for a simple way to create .deb packages for things which have no source code to compile (configs, shell scripts, proprietary software). This was quite a problem, because most of the packaging tutorials assume you have a source tarball you want to compile. Then I found this short tutorial (German). Afterwards, I created a small script to create a simple repository, like this:

      rm /export/my-repository/repository/*
      cd /home/tdeutsch/deb-pkg
      for i in $(ls | grep my); do dpkg -b ./$i /export/my-repository/repository/$i.deb; done
      cd /export/avanon-repository/repository
      gpg --armor --export "My Package Signing Key" > PublicKey
      apt-ftparchive packages ./ | gzip > Packages.gz
      apt-ftparchive packages ./ > Packages
      apt-ftparchive release ./ > /tmp/Release.tmp; mv /tmp/Release.tmp Release
      gpg --output Release.gpg -ba Release

    I added the key to the apt keyring and included the source like this: deb http://my.default.com/my-repository/ ./ It looks like the repo itself is working well (I ran into some problems; to fix them I needed to add the Packages file twice and use the temp-file workaround for the Release file). I also put some downloaded .deb files into the repo, and they seem to work without problems. But my self-created packages don't. When I do sudo apt-get update, they cause errors like this: E: Problem parsing dependency Depends E: Error occurred while processing my-printerconf (NewVersion2) E: Problem with MergeList /var/lib/apt/lists/my.default.com_my-repository_._Packages E: The package lists or status file could not be parsed or opened. Does anyone have an idea what I did wrong?
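    Since "Problem parsing dependency Depends" often points at a malformed control file inside the package, here is a minimal DEBIAN/control that dpkg -b accepts, for comparison. All field values are placeholders; a Depends: line, if present, must be a well-formed comma-separated list of package names or be left out entirely.

      # my-printerconf/DEBIAN/control (minimal example; values are placeholders)
      Package: my-printerconf
      Version: 1.0-1
      Section: admin
      Priority: optional
      Architecture: all
      Maintainer: Your Name <you@example.com>
      Description: Site-local printer configuration
       Ships the local CUPS printer definitions.

      # then build as before:
      dpkg -b ./my-printerconf /export/my-repository/repository/my-printerconf.deb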

    Read the article

  • How to configure KDE default settings for a new user of a group?

    - by Adobe
    I'm a sysadmin on a Kubuntu 11.10 machine. Where do I configure the basic defaults for a new user (say, one belonging to group "users")?

    Edit 1: I want to configure languages. Currently my new users get the English and Bulgarian languages; I want them to get English and Russian, and also to have Alt-CapsLock set as the input-language-switching combination.

    Edit 2: How do I configure things in /usr/share/kde4? When I do kdesudo systemsettings and save the configuration, only root's settings get changed, not the /usr/share/kde4 ones.

    Edit 3: A new user gets the /etc/skel files controlling bash behaviour and appearance. What about a new user's KDE default files: where are they stored?

    Edit 4: Oh, I found some hints: kde4-config --path config gives a list of folders (separated by colons) where KDE looks for configs. My machine responded with:

      /home/boris/.kde/share/config/
      /etc/kde4/
      /usr/share/kubuntu-default-settings/kde4-profile/default/share/config/
      /usr/share/kde4/config/
      /usr/share/desktop-base/profiles/kde-profile/share/config/

    It looks like the third line is where KDE takes the default options from. So I found these zillions of settings, but no GUI way to configure them ((.

    Edit 5: Finally, I created a dummy user, configured it, and wrote a script which gives its settings to a given user (or users); the trick is to chown after transferring the dot files from one user to another. I've tested it and it works fine.
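    A hedged sketch of the Edit 5 approach (the template user name, target user and copied paths are placeholders; adjust them to whatever the dummy account actually contains):

      #!/bin/bash
      # Copy a pre-configured template user's KDE settings to a new account,
      # then fix ownership so the new user can actually modify them.
      TEMPLATE=kdetemplate
      TARGET="$1"
      cp -a "/home/$TEMPLATE/.kde" "/home/$TARGET/"
      chown -R "$TARGET:$TARGET" "/home/$TARGET/.kde"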

    Read the article

  • How can I play 24Bit 96000Hz vinyl rips?

    - by Peter
    I am getting music through the NVIDIA HDMI output, but it is all down-sampled to 44000 Hz. I spent at least 3 hours searching for an answer yesterday. I use PulseAudio and I even changed the default settings in the Pulse folder to 96000, but it does not work, though 5.1 sound works perfectly. It all works well in Windows 7, but I prefer to use Ubuntu. The config files for all the programs are in the /etc folder; looking in the pulse folder there, there are three files. I changed the permissions on two of them and changed 16 and 44000 to 24 and 96000, as I had read in another forum, but it worked for them and not for me. (5.1 definitely works, as I have a special MP4 file that can check each channel.) I did a lot of restarts and checking after these changes, since nothing had happened; my amp tells me the frequency of the incoming signal, and that does work, as I have tested on Windows 7. I have also gone to System Settings to download the latest driver for my NVIDIA card. Even if it never works, I guess I can always play CDs, but I think my player has become more unstable since I changed the configs, so I may uninstall and reinstall PulseAudio Volume Control.
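    For reference, the per-daemon defaults the post is describing live in /etc/pulse/daemon.conf. A sketch of the two relevant lines, assuming the HDMI sink actually accepts 24-bit/96 kHz input; remove the leading ";" that comments them out by default, then restart the daemon:

      # /etc/pulse/daemon.conf
      default-sample-format = s24le
      default-sample-rate = 96000

      # restart the per-user PulseAudio daemon afterwards (it respawns automatically)
      pulseaudio -k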

    Read the article

  • Can I upgrade my Ubuntu version and make it the primary OS after originally installing with Wubi?

    - by Garrick Wann
    I have recently installed Ubuntu 12.04 using Wubi 12.04 and now wish to upgrade to a full installation of Ubuntu 14.04. Before attempting to upgrade through the update center I did some research on going from a Wubi installation (alongside Windows) to a full installation with Ubuntu as the primary and only OS, and found that it is in fact doable through the update center; it is just highly recommended to perform a full backup first. I have now finished backing up all the data I need to worry about and began the upgrade process through the update center, and received the following error:

      Your graphics hardware may not be fully supported in Ubuntu 14.04. Running the 'unity'
      desktop environment is not fully supported by your graphics hardware. You will maybe end
      up in a very slow environment after the upgrade. Our advice is to keep the LTS version
      for now. For more information see https://wiki.ubuntu.com/X/Bugs/UpdateManagerWarningForUnity3D
      Do you still want to continue with the upgrade?

    My questions are as follows:
    A. Isn't 14.04 an LTS version?
    B. What are your recommendations to ensure my graphics driver is installed correctly and I'm not stuck with a bad config/install?

    Read the article

  • Rails + Nginx + Unicorn multiple apps

    - by Mikhail Nikalyukin
    I get the server where is currently installed two apps and i need to add another one, here is my configs. nginx.conf user www-data www-data; worker_processes 4; pid /var/run/nginx.pid; events { worker_connections 768; # multi_accept on; } http { sendfile on; tcp_nopush on; tcp_nodelay on; keepalive_timeout 65; types_hash_max_size 2048; include /etc/nginx/mime.types; default_type application/octet-stream; ## # Logging Settings ## access_log /var/log/nginx/access.log; error_log /var/log/nginx/error.log; ## # Disable unknown domains ## server { listen 80 default; server_name _; return 444; } ## # Virtual Host Configs ## include /home/ruby/apps/*/shared/config/nginx.conf; } unicorn.rb deploy_to = "/home/ruby/apps/staging.domain.com" rails_root = "#{deploy_to}/current" pid_file = "#{deploy_to}/shared/pids/unicorn.pid" socket_file= "#{deploy_to}/shared/sockets/.sock" log_file = "#{rails_root}/log/unicorn.log" err_log = "#{rails_root}/log/unicorn_error.log" old_pid = pid_file + '.oldbin' timeout 30 worker_processes 10 # ????? ???? ? ??????????? ?? ????????, ???????? ??????? ? ??????? ???? ???? listen socket_file, :backlog => 1024 pid pid_file stderr_path err_log stdout_path log_file preload_app true GC.copy_on_write_friendly = true if GC.respond_to?(:copy_on_write_friendly=) before_exec do |server| ENV["BUNDLE_GEMFILE"] = "#{rails_root}/Gemfile" end before_fork do |server, worker| defined?(ActiveRecord::Base) and ActiveRecord::Base.connection.disconnect! if File.exists?(old_pid) && server.pid != old_pid begin Process.kill("QUIT", File.read(old_pid).to_i) rescue Errno::ENOENT, Errno::ESRCH end end end after_fork do |server, worker| defined?(ActiveRecord::Base) and ActiveRecord::Base.establish_connection end Also im added capistrano to the project deploy.rb # encoding: utf-8 require 'capistrano/ext/multistage' require 'rvm/capistrano' require 'bundler/capistrano' set :stages, %w(staging production) set :default_stage, "staging" default_run_options[:pty] = true ssh_options[:paranoid] = false ssh_options[:forward_agent] = true set :scm, "git" set :user, "ruby" set :runner, "ruby" set :use_sudo, false set :deploy_via, :remote_cache set :rvm_ruby_string, '1.9.2' # Create uploads directory and link task :configure, :roles => :app do run "cp #{shared_path}/config/database.yml #{release_path}/config/database.yml" # run "ln -s #{shared_path}/db/sphinx #{release_path}/db/sphinx" # run "ln -s #{shared_path}/config/unicorn.rb #{release_path}/config/unicorn.rb" end namespace :deploy do task :restart do run "if [ -f #{unicorn_pid} ] && [ -e /proc/$(cat #{unicorn_pid}) ]; then kill -s USR2 `cat #{unicorn_pid}`; else cd #{deploy_to}/current && bundle exec unicorn_rails -c #{unicorn_conf} -E #{rails_env} -D; fi" end task :start do run "cd #{deploy_to}/current && bundle exec unicorn_rails -c #{unicorn_conf} -E #{rails_env} -D" end task :stop do run "if [ -f #{unicorn_pid} ] && [ -e /proc/$(cat #{unicorn_pid}) ]; then kill -QUIT `cat #{unicorn_pid}`; fi" end end before 'deploy:finalize_update', 'configure' after "deploy:update", "deploy:migrate", "deploy:cleanup" require './config/boot' nginx.conf in app shared path upstream staging_whotracker { server unix:/home/ruby/apps/staging.whotracker.com/shared/sockets/.sock; } server { listen 209.105.242.45; server_name beta.whotracker.com; rewrite ^/(.*) http://www.beta.whotracker.com/$1 permanent; } server { listen 209.105.242.45; server_name www.beta.hotracker.com; root /home/ruby/apps/staging.whotracker.com/current/public; location ~ ^/sitemaps/ { root 
/home/ruby/apps/staging.whotracker.com/current/system; if (!-f $request_filename) { break; } if (-f $request_filename) { expires -1; break; } } # cache static files :P location ~ ^/(images|javascripts|stylesheets)/ { root /home/ruby/apps/staging.whotracker.com/current/public; if ($query_string ~* "^[0-9a-zA-Z]{40}$") { expires max; break; } if (!-f $request_filename) { break; } } if ( -f /home/ruby/apps/staging.whotracker.com/shared/offline ) { return 503; } location /blog { index index.php index.html index.htm; try_files $uri $uri/ /blog/index.php?q=$uri; } location ~ \.php$ { try_files $uri =404; include /etc/nginx/fastcgi_params; fastcgi_pass unix:/var/run/php-fastcgi/php-fastcgi.socket; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; } location / { proxy_set_header HTTP_REFERER $http_referer; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Host $http_host; proxy_redirect off; proxy_max_temp_file_size 0; # If the file exists as a static file serve it directly without # running all the other rewite tests on it if (-f $request_filename) { break; } if (!-f $request_filename) { proxy_pass http://staging_whotracker; break; } } error_page 502 =503 @maintenance; error_page 500 504 /500.html; error_page 503 @maintenance; location @maintenance { rewrite ^(.*)$ /503.html break; } } unicorn.log executing ["/home/ruby/apps/staging.whotracker.com/shared/bundle/ruby/1.9.1/bin/unicorn_rails", "-c", "/home/ruby/apps/staging.whotracker.com/current/config/unicorn.rb", "-E", "staging", "-D", {5=>#<Kgio::UNIXServer:/home/ruby/apps/staging.whotracker.com/shared/sockets/.sock>}] (in /home/ruby/apps/staging.whotracker.com/releases/20120517114413) I, [2012-05-17T06:43:48.111717 #14636] INFO -- : inherited addr=/home/ruby/apps/staging.whotracker.com/shared/sockets/.sock fd=5 I, [2012-05-17T06:43:48.111938 #14636] INFO -- : Refreshing Gem list worker=0 ready ... master process ready ... reaped #<Process::Status: pid 2590 exit 0> worker=6 ... master complete Deploy goes successfully, but when i try to access beta.whotracker.com or ip-address i get SERVER NOT FOUND error, while others app works great. Nothing shows up in error logs. Can you please point me where is my fault?

    Read the article

  • nginx + apache subdomain redirection fault

    - by webwolf
    i really need your advice folks since i'm experiencing some troubles with nginx & apache2 subdomains configs first of all, there's a site (say, site.com) and two subdomains (links.site.com and shop.site.com) whose files are physically located at the same level of FS hierarchy as the site.com itself my hoster has configured both apache and nginx by my request, but it still doesn't work as it used to both of subdomains point to the main page of site.com for some unknown and implicit (for me) reason :( my assumption is that's happen because site.com record is placed first in both configs?!.. please help me solve this out! every opinion would be appreciated =) nginx.conf: server { listen 95.169.187.234:80; server_name site.com www.site.com ; access_log /home/www/site.com/logs/nginx.access.log main; location ~* ^.+\.(jpeg|jpg|gif|png|ico|css|zip|tgz|gz|rar|bz2|doc|xls|exe|pdf|ppt|txt|tar|mid|midi|wav|bmp|rtf|js|swf|avi|mp3|mpg|mpeg|asf|vmw)$ { expires 30d; root /home/www/site.com/www; } #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # error_page 500 502 503 504 /50x.html; location = /50x.html { root html; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # location ~ /\.ht { deny all; } location / { set $referer $http_referer; proxy_pass http://127.0.0.1:8080/; proxy_redirect off; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Referer $referer; proxy_set_header Host $host; client_max_body_size 10m; client_body_buffer_size 64k; proxy_connect_timeout 90; proxy_send_timeout 90; proxy_read_timeout 90; proxy_buffer_size 4k; proxy_buffers 4 32k; proxy_busy_buffers_size 64k; proxy_temp_file_write_size 64k; } } server { listen 95.169.187.234:80; server_name links.site.com www.links.site.com ; access_log /home/www/links.site.com/logs/nginx.access.log main; location ~* ^.+\.(jpeg|jpg|gif|png|ico|css|zip|tgz|gz|rar|bz2|doc|xls|exe|pdf|ppt|txt|tar|mid|midi|wav|bmp|rtf|js|swf|avi|mp3|mpg|mpeg|asf|vmw)$ { expires 30d; root /home/www/links.site.com/www; } #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # error_page 500 502 503 504 /50x.html; location = /50x.html { root html; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # location ~ /\.ht { deny all; } location / { set $referer $http_referer; proxy_pass http://127.0.0.1:8080/; proxy_redirect off; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Referer $referer; proxy_set_header Host $host; client_max_body_size 10m; client_body_buffer_size 64k; proxy_connect_timeout 90; proxy_send_timeout 90; proxy_read_timeout 90; proxy_buffer_size 4k; proxy_buffers 4 32k; proxy_busy_buffers_size 64k; proxy_temp_file_write_size 64k; } } server { listen 95.169.187.234:80; server_name shop.site.com www.shop.site.com ; access_log /home/www/shop.site.com/logs/nginx.access.log main; location ~* ^.+\.(jpeg|jpg|gif|png|ico|css|zip|tgz|gz|rar|bz2|doc|xls|exe|pdf|ppt|txt|tar|mid|midi|wav|bmp|rtf|js|swf|avi|mp3|mpg|mpeg|asf|vmw)$ { expires 30d; root /home/www/shop.site.com/www; } #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # error_page 500 502 503 504 /50x.html; location = /50x.html { root html; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # location ~ /\.ht { deny all; } location / { set $referer 
$http_referer; proxy_pass http://127.0.0.1:8080/; proxy_redirect off; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Referer $referer; proxy_set_header Host $host; client_max_body_size 10m; client_body_buffer_size 64k; proxy_connect_timeout 90; proxy_send_timeout 90; proxy_read_timeout 90; proxy_buffer_size 4k; proxy_buffers 4 32k; proxy_busy_buffers_size 64k; proxy_temp_file_write_size 64k; } } httpd.conf: # ServerRoot "/usr/local/apache2" PidFile /var/run/httpd.pid Timeout 300 KeepAlive On MaxKeepAliveRequests 100 KeepAliveTimeout 15 Listen 127.0.0.1:8080 NameVirtualHost 127.0.0.1:8080 ... #Listen *:80 NameVirtualHost *:80 ServerName www.site.com ServerAlias site.com UseCanonicalName Off CustomLog /home/www/site.com/logs/custom_log combined ErrorLog /home/www/site.com/logs/error_log DocumentRoot /home/www/site.com/www AllowOverride All Options +FollowSymLinks Options -MultiViews Options -Indexes Options Includes Order allow,deny Allow from all DirectoryIndex index.html index.htm index.php ServerName www.links.site.com ServerAlias links.site.com UseCanonicalName Off CustomLog /home/www/links.site.com/logs/custom_log combined ErrorLog /home/www/links.site.com/logs/error_log DocumentRoot /home/www/links.site.com/www AllowOverride All Options +FollowSymLinks Options -MultiViews Options -Indexes Options Includes Order allow,deny Allow from all DirectoryIndex index.html index.htm index.php ServerName www.shop.site.com ServerAlias shop.site.com UseCanonicalName Off CustomLog /home/www/shop.site.com/logs/custom_log combined ErrorLog /home/www/shop.site.com/logs/error_log DocumentRoot /home/www/shop.site.com/www AllowOverride All Options +FollowSymLinks Options -MultiViews Options -Indexes Options Includes Order allow,deny Allow from all DirectoryIndex index.html index.htm index.php # if DSO load module first: LoadModule rpaf_module modules/mod_rpaf-2.0.so RPAFenable On RPAFsethostname On RPAFproxy_ips 127.0.0.1 RPAFheader X-Forwarded-For Include conf/virthost/*.conf

    Read the article

  • Configuring ASMX Web Service End Points - web.config

    - by tyndall
    I have set up references to two web services in a separate assembly, TestProj.Core. I reference this project in a Web Application Project called TestProj.Web. When I set up the references in TestProj.Core, the wizard gave me an app.config and threw an application settings section into it. How do I get these settings to my web app? Copy and paste them into web.config? "Always Copy" the app.config out to the bin directory? Any good articles on multiple configs?

    Read the article

  • Dynamically creating GWT screens using Metadata?

    - by Francis Shanahan
    I have an AWT applet application that needs to be ported over to GWT. The applet screens are described in metadata and the applet renders each screen dynamically using reflection. We'd like the same thing in GWT/ExtGWT. I've built a working version of this in ExtJS, whereby the metadata is turned into ExtJS screen configs in the form of JSON. The drawback with this approach is that the "wiring" of controls to data needs to be written in JavaScript. GWT is preferred since it'd be all Java code, no JS. Upon digging in, it appears possible to render the screens off the metadata using GWT.create(). The problem I'm having is that wiring a dynamically created button, for example, to an event handler requires reflection, which is not supported in GWT. Is this conclusion correct? And if so, are there any other ways to achieve this type of dynamic UI using ExtGWT?

    Read the article

  • Zend routing, throws resource not found

    - by bluedaniel
    I've got a URL, http://dev.local/foodies/view?id=bluedaniel, and I've got this in my bootstrap:

      protected function _initRoute()
      {
          $config = new Zend_Config_Ini(APPLICATION_PATH . '/configs/routes.ini', 'production');
          $router = new Zend_Controller_Router_Rewrite();
          $router->addConfig($config, 'resources');
      }

    and I've also got this in my routes.ini:

      [production]
      resources.router.routes.foodies_view.route = ":foodies/:id"
      resources.router.routes.foodies_view.defaults.module = "foodies"
      resources.router.routes.foodies_view.defaults.controller = "view"
      resources.router.routes.foodies_view.defaults.action = "index"

    So http://dev.local/foodies/bluedaniel should work, right? With this setup, however, I get a "Resource 'foodies:bluedaniel' not found" error.

    Read the article

  • Git & Web-design: handling multiple customized templates

    - by o_O Tync
    I'm developing a CMS (with Django, but that doesn't matter) and have chosen Git. Installations will vary in: configs, database contents, media, and templates. The first three are not a problem with Git: we simply don't need these :) While developing, I have one default template with related media. Later, each customer will receive their own design based on the default templates (some slight customization). I'm not going to support each of the custom templates as I introduce new features. Modularity helps with this but is not a 100% solution. Do you have any experience to share?

    Read the article

  • Auto detect internal/external development environment

    - by zaf
    We use the following function to auto-detect whether we are on an internal machine or on a live server, and then choose the appropriate configs for various components:

      function devIsLocal(){
          $res=false;
          $http_host=$_SERVER['HTTP_HOST'];
          if($http_host=='localhost')$res=true;
          if($http_host=='127.0.0.1')$res=true;
          if(substr($http_host,-4)=='.lan')$res=true;
          if(strpos($http_host, '.')===false)$res=true;
          return($res);
      }

    As you can see, it relies only on the HTTP_HOST value. Of course, if you use a virtual host locally, like example.com, the function will be tricked. Are there any other ways to fool the function? And what other variables/places could we peek at to determine where we are?

    Read the article

  • Can't get Zend Studio and PHPUnit to work together

    - by dimbo
    I have a created a simple doctrine2/zend skeleton project and am trying to get unit testing working with zend studio. The tests work perfectly through the PHPunit CLI but I just can't get them to work in zend studio. It comes up with an error saying : 'No Tests was executed' and the following output in the debug window : X-Powered-By: PHP/5.2.14 ZendServer/5.0 Set-Cookie: ZendDebuggerCookie=127.0.0.1%3A10137%3A0||084|77742D65|1016; path=/ Content-type: text/html <br /> <b>Warning</b>: Unexpected character in input: '\' (ASCII=92) state=1 in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>8</b><br /> <br /> <b>Warning</b>: Unexpected character in input: '\' (ASCII=92) state=1 in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>8</b><br /> <br /> <b>Parse error</b>: syntax error, unexpected T_STRING in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>8</b><br /> The test is as follows: <?php require_once 'Zend/Application.php'; require_once 'Zend/Test/PHPUnit/ControllerTestCase.php'; abstract class ControllerTestCase extends Zend_Test_PHPUnit_ControllerTestCase { public function setUp() { $this->bootstrap = new Zend_Application( 'testing', APPLICATION_PATH . '/configs/application.ini' ); parent::setUp(); } public function tearDown() { parent::tearDown(); } } <?php class IndexControllerTest extends ControllerTestCase { public function testDoesHomePageExist() { $this->dispatch('/'); $this->assertController('index'); $this->assertAction('index'); } } <?php class ModelTestCase extends PHPUnit_Framework_TestCase { protected $em; public function setUp() { $application = new Zend_Application( 'testing', APPLICATION_PATH . '/configs/application.ini' ); $bootstrap = $application->bootstrap()->getBootstrap(); $this->em = $bootstrap->getResource('entityManager'); parent::setUp(); } public function tearDown() { parent::tearDown(); } } <?php class UserModelTest extends ModelTestCase { public function testCanInstantiateUser() { $this->assertInstanceOf('\Entities\User', new \Entities\User); } public function testCanSaveAndRetrieveUser() { $user = new \Entities\User; $user->setFirstname('wjgilmore-test'); $user->setemail('[email protected]'); $user->setpassword('jason'); $user->setAddress1('calle san antonio'); $user->setAddress2('albayzin'); $user->setSurname('testman'); $user->setConfirmed(TRUE); $this->em->persist($user); $this->em->flush(); $user = $this->em->getRepository('Entities\User')->findOneByFirstname('wjgilmore-test'); $this->assertEquals('wjgilmore-test', $user->getFirstname()); } public function testCanDeleteUser() { $user = new \Entities\User; $user = $this->em->getRepository('Entities\User')->findOneByFirstname('wjgilmore-test'); $this->em->remove($user); $this->em->flush(); } } And the bootstrap: <?php define('BASE_PATH', realpath(dirname(__FILE__) . '/../../')); define('APPLICATION_PATH', BASE_PATH . '/application'); set_include_path( '.' . PATH_SEPARATOR . BASE_PATH . '/library' . PATH_SEPARATOR . get_include_path() ); require_once 'controllers/ControllerTestCase.php'; require_once 'models/ModelTestCase.php'; Here is the new error after setting PHP Executable to 5.3 as Gordon suggested: X-Powered-By: PHP/5.3.3 ZendServer/5.0 Set-Cookie: ZendDebuggerCookie=127.0.0.1%3A10137%3A0||084|77742D65|1000; path=/ Content-type: text/html <br /> <b>Fatal error</b>: Class 'ModelTestCase' not found in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>4</b><br />

    Read the article

  • Castle Windsor: Inject NameValueCollection vs. Dictionary

    - by Aren B
    I've already done many configs where dictionaries are passed into services in the <parameters> block. But what I find myself needing right now is to build a NameValueCollection (allowing multiple entries with the same key) or a collection of KeyValuePair objects. The reason for this is that I'm not using the dictionary to look up b when given a; I'm basically using it to pass in a tuple (pair) of (a, b) to be used later in the code. I'm kind of new to Castle Windsor, and I was wondering how I would go about getting a List of KeyValuePairs injected, or a NameValueCollection injected.

    Read the article

  • I want to version control my entire slice

    - by Tom
    I'm renting a slice (i.e., a VPS) from Slicehost. I've spent a day or two filling up /usr with my favorite packages, /etc with configs and init scripts, and so on. Now I want to:
    (1) save this whole setup somewhere (e.g., to load onto another machine);
    (2) see what changes I've made to which files;
    (3) revert changes, tag revisions, and all that other good version control stuff.
    Saving a disk image gives me (1), but not (2) and (3). Using Subversion (svn import / svn://someotherhost) might give me all three, but I expect problems if I actually try to check a project out into / and maintain .svn directories in root-owned areas. And to load my setup onto a fresh slice, I'd need to install an svn client on it first. Is there a good way to do what I want to do?
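    One common pattern that covers (1) through (3) without scattering .svn or .git directories through root-owned areas is to keep the repository somewhere else and point the version-control tool's work tree at /. A hedged sketch with Git (the repository path and the tracked directories are placeholders, and you would still want an ignore list for /proc, /sys, /var/cache and the like):

      sudo git init --bare /root/slice.git
      alias slice='sudo git --git-dir=/root/slice.git --work-tree=/'
      slice add /etc /usr/local
      slice commit -m "initial snapshot of configs and locally installed bits"
      slice status    # later: see what has drifted since the last commit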

    Read the article
