Search Results

Search found 8997 results on 360 pages for 'apt cache'.

  • Can't install fastcgi on Ubuntu Server: Package libapache2-mod-fastcgi is not available

    - by BlueDragon
    When I try to install FastCGI on Ubuntu Server 12.04, I get the following error:

        $ sudo apt-get install libapache2-mod-fastcgi
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Package libapache2-mod-fastcgi is not available, but is referred to by another package.
        This may mean that the package is missing, has been obsoleted, or
        is only available from another source
        E: Package 'libapache2-mod-fastcgi' has no installation candidate

    Any solution?
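
    A likely cause: libapache2-mod-fastcgi lives in Ubuntu's multiverse component (its license keeps it out of main/universe), so apt cannot see it until multiverse is enabled. A minimal sketch of the fix, assuming the package is in multiverse for 12.04 (precise):

        # enable multiverse (or add the line to /etc/apt/sources.list by hand),
        # refresh the index, then retry
        sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu precise multiverse"
        sudo apt-get update
        sudo apt-get install libapache2-mod-fastcgi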

  • Trouble determining proper decoding of a REST response from an ArcGIS REST service using IHttpModule

    - by Ryan Taylor
    First, a little background on what I am trying to achieve. I have an application that utilizes REST services served by ArcGIS Server and IIS7. The REST services return data in one of several different formats; I am requesting a JSON response. I want to be able to modify the response (remove or add parameters) before it is sent to the client. However, I am having difficulty converting the stream to a string that I can modify. To that end, I have implemented the following code in order to inspect the stream.

    SecureModule.cs:

        using System;
        using System.Web;

        namespace SecureModuleTest
        {
            public class SecureModule : IHttpModule
            {
                public void Init(HttpApplication context)
                {
                    context.BeginRequest += new EventHandler(OnBeginRequest);
                }

                public void Dispose() { }

                public void OnBeginRequest(object sender, EventArgs e)
                {
                    HttpApplication application = (HttpApplication) sender;
                    HttpContext context = application.Context;
                    HttpRequest request = context.Request;
                    HttpResponse response = context.Response;
                    response.Filter = new ServicesFilter(response.Filter);
                }
            }
        }

    ServicesFilter.cs:

        using System;
        using System.IO;
        using System.Text;

        namespace SecureModuleTest
        {
            class ServicesFilter : MemoryStream
            {
                private readonly Stream _outputStream;
                private StringBuilder _content;

                public ServicesFilter(Stream output)
                {
                    _outputStream = output;
                    _content = new StringBuilder();
                }

                public override void Write(byte[] buffer, int offset, int count)
                {
                    _content.Append(Encoding.UTF8.GetString(buffer, offset, count));
                    using (TextWriter textWriter = new StreamWriter(@"C:\temp\content.txt", true))
                    {
                        textWriter.WriteLine(String.Format("Buffer: {0}", _content.ToString()));
                        textWriter.WriteLine(String.Format("Length: {0}", buffer.Length));
                        textWriter.WriteLine(String.Format("Offset: {0}", offset));
                        textWriter.WriteLine(String.Format("Count: {0}", count));
                        textWriter.WriteLine("");
                        textWriter.Close();
                    }

                    // Modify response
                    _outputStream.Write(buffer, offset, count);
                }
            }
        }

    The module is installed in the /ArcGIS/rest/ virtual directory and is executed via the following GET request:

        http://localhost/ArcGIS/rest/services/?f=json&pretty=true

    The web page displays the expected response; however, the text file tells a very different (encoded?) story.

    Expected response:

        {"currentVersion" : "10.0", "folders" : [], "services" : [ ] }

    Text file contents:

        Buffer: ? ?`I?%&/m?{J?J??t??`$?@??????iG#)?*??eVe]f@????{???{???;?N'????\fdl??J??!????~|?"~?G?u]???'?)??G?????G??7N????W??{?????,??|?OR????q?
        Length: 4096
        Offset: 0
        Count: 168

        Buffer: ? ?`I?%&/m?{J?J??t??`$?@??????iG#)?*??eVe]f@????{???{???;?N'????\fdl??J??!????~|?"~?G?u]???'?)??G?????G??7N????W??{?????,??|?OR????q?K???!P
        Length: 4096
        Offset: 0
        Count: 11

    Interestingly, Fiddler depicts a similar picture.

    Fiddler request:

        GET http://localhost/ArcGIS/rest/services/?f=json&pretty=true HTTP/1.1
        Host: localhost
        Connection: keep-alive
        User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.70 Safari/533.4
        Referer: http://localhost/ArcGIS/rest/services
        Cache-Control: no-cache
        Pragma: no-cache
        Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
        Accept-Encoding: gzip,deflate,sdch
        Accept-Language: en-US,en;q=0.8
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
        Cookie: a=mWz_JFOusuGPnS3w5xx1BSUuyKGB3YZo92Dy2SUntP2MFWa8MaVq6a4I_IYBLKuefXDZANQMeqvxdGBgQoqTKz__V5EQLHwxmKlUNsaK7do.

    Fiddler response - before clicking Decode:

        HTTP/1.1 200 OK
        Content-Type: text/plain;charset=utf-8
        Content-Encoding: gzip
        ETag: 719143506
        Server: Microsoft-IIS/7.5
        X-Powered-By: ASP.NET
        Date: Thu, 10 Jun 2010 01:08:43 GMT
        Content-Length: 179

        ????????`I?%&/m?{J?J??t??`$?@??????iG#)?*??eVe]f@????{???{???;?N'????\fdl??J??!????~|?"~?G?u]???'?)??G?????G??7N????W??{?????,??|?OR????q?K???! P???

    Fiddler response - after clicking Decode:

        HTTP/1.1 200 OK
        Content-Type: text/plain;charset=utf-8
        ETag: 719143506
        Server: Microsoft-IIS/7.5
        X-Powered-By: ASP.NET
        Date: Thu, 10 Jun 2010 01:08:43 GMT
        Content-Length: 80

        {"currentVersion" : "10.0", "folders" : [], "services" : [ ] }

    I think the problem may be a result of compression and/or chunking of the data (this might be why I am receiving two calls to ServicesFilter.Write(...)); however, I have not yet been able to solve the issue. How might I decode, unzip, and otherwise convert the byte stream into the string I know it should be, for modification by my filter?
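
    The Content-Encoding: gzip header in the undecoded Fiddler response suggests the filter is seeing compressed bytes, which is exactly why UTF-8 decoding yields garbage. One approach (a hedged sketch, not the only fix: you could instead strip the client's Accept-Encoding header so IIS never compresses) is to buffer everything the filter receives and gunzip it once the response completes. Class and variable names below are illustrative, and the Content-Length / Content-Encoding headers must be corrected if the modified body is written back uncompressed:

        using System;
        using System.IO;
        using System.IO.Compression;
        using System.Text;

        namespace SecureModuleTest
        {
            // Sketch: collect the gzipped body, decompress on Close, then write
            // the (optionally modified) plain text to the real output stream.
            class GunzipFilter : Stream
            {
                private readonly Stream _outputStream;
                private readonly MemoryStream _buffered = new MemoryStream();

                public GunzipFilter(Stream output) { _outputStream = output; }

                public override void Write(byte[] buffer, int offset, int count)
                {
                    _buffered.Write(buffer, offset, count);   // just accumulate compressed bytes
                }

                public override void Close()
                {
                    _buffered.Position = 0;
                    using (GZipStream gzip = new GZipStream(_buffered, CompressionMode.Decompress))
                    using (StreamReader reader = new StreamReader(gzip, Encoding.UTF8))
                    {
                        string json = reader.ReadToEnd();     // readable JSON at last
                        // ... modify json here ...
                        byte[] plain = Encoding.UTF8.GetBytes(json);
                        _outputStream.Write(plain, 0, plain.Length);
                    }
                    _outputStream.Close();
                    base.Close();
                }

                // Minimal Stream plumbing required by the abstract base class.
                public override bool CanRead { get { return false; } }
                public override bool CanSeek { get { return false; } }
                public override bool CanWrite { get { return true; } }
                public override void Flush() { }
                public override long Length { get { throw new NotSupportedException(); } }
                public override long Position
                {
                    get { throw new NotSupportedException(); }
                    set { throw new NotSupportedException(); }
                }
                public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
                public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
                public override void SetLength(long value) { throw new NotSupportedException(); }
            }
        }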

  • How can I export a DataTable to MS Word 2007, Excel 2007, or CSV from ASP.NET?

    - by bala3569
    Hi, I am using the code below to export a DataTable to MS Word, Excel and CSV, and it works fine. The problem is that this code exports to the MS Word 2003 / Excel 2003 formats. I need to export my DataTable to Word 2007, Excel 2007 and CSV, because I am supposed to handle more than 100,000 records at a time and, as we know, Excel 2003 supports only 65,536 rows per sheet. Please help me out if you know how to export a DataTable or DataSet to MS Word 2007 or Excel 2007.

        public static void Convertword(DataTable dt, HttpResponse Response, string filename)
        {
            Response.Clear();
            Response.AddHeader("content-disposition", "attachment;filename=" + filename + ".doc");
            Response.Charset = "";
            Response.Cache.SetCacheability(HttpCacheability.NoCache);
            Response.ContentType = "application/vnd.word";
            System.IO.StringWriter stringWrite = new System.IO.StringWriter();
            System.Web.UI.HtmlTextWriter htmlWrite = new System.Web.UI.HtmlTextWriter(stringWrite);
            System.Web.UI.WebControls.GridView dg = new System.Web.UI.WebControls.GridView();
            dg.DataSource = dt;
            dg.DataBind();
            dg.RenderControl(htmlWrite);
            Response.Write(stringWrite.ToString());
            Response.End();
            //HttpContext.Current.ApplicationInstance.CompleteRequest();
        }

        public static void Convertexcel(DataTable dt, HttpResponse Response, string filename)
        {
            Response.Clear();
            Response.AddHeader("content-disposition", "attachment;filename=" + filename + ".xls");
            Response.Charset = "";
            Response.Cache.SetCacheability(HttpCacheability.NoCache);
            Response.ContentType = "application/vnd.ms-excel";
            System.IO.StringWriter stringWrite = new System.IO.StringWriter();
            System.Web.UI.HtmlTextWriter htmlWrite = new System.Web.UI.HtmlTextWriter(stringWrite);
            System.Web.UI.WebControls.DataGrid dg = new System.Web.UI.WebControls.DataGrid();
            dg.DataSource = dt;
            dg.DataBind();
            dg.RenderControl(htmlWrite);
            Response.Write(stringWrite.ToString());
            Response.End();
            //HttpContext.Current.ApplicationInstance.CompleteRequest();
        }

        public static void ConvertCSV(DataTable dataTable, HttpResponse Response, string filename)
        {
            Response.Clear();
            Response.Buffer = true;
            Response.AddHeader("content-disposition", "attachment;filename=" + filename + ".csv");
            Response.Charset = "";
            Response.Cache.SetCacheability(HttpCacheability.NoCache);
            Response.ContentType = "Application/x-msexcel";
            StringBuilder sb = new StringBuilder();
            if (dataTable.Columns.Count != 0)
            {
                foreach (DataColumn column in dataTable.Columns)
                {
                    sb.Append(column.ColumnName + ',');
                }
                sb.Append("\r\n");
                foreach (DataRow row in dataTable.Rows)
                {
                    foreach (DataColumn column in dataTable.Columns)
                    {
                        if (row[column].ToString().Contains(',') == true)
                        {
                            row[column] = row[column].ToString().Replace(",", "");
                        }
                        sb.Append(row[column].ToString() + ',');
                    }
                    sb.Append("\r\n");
                }
            }
            Response.Write(sb.ToString());
            Response.End();
            //HttpContext.Current.ApplicationInstance.CompleteRequest();
        }
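
    Worth noting: the Word/Excel methods above actually emit HTML with a .doc/.xls extension, which is why they are stuck with the 2003-era limits. For a real .xlsx file that can hold 100,000+ rows, writing Open XML is the usual route. A hedged sketch using the EPPlus library (an assumption on my part; any Open XML writer such as the Open XML SDK or NPOI would also work):

        using OfficeOpenXml;   // EPPlus, assumed to be referenced by the project

        public static void ConvertExcel2007(DataTable dt, HttpResponse Response, string filename)
        {
            using (ExcelPackage pkg = new ExcelPackage())
            {
                ExcelWorksheet ws = pkg.Workbook.Worksheets.Add("Sheet1");
                ws.Cells["A1"].LoadFromDataTable(dt, true);   // true = emit a header row
                Response.Clear();
                Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
                Response.AddHeader("content-disposition", "attachment;filename=" + filename + ".xlsx");
                Response.BinaryWrite(pkg.GetAsByteArray());   // .xlsx is binary, not HTML text
                Response.End();
            }
        }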

  • Login box not shown on Ubuntu

    - by Alexandre
    Hi, I've installed Ubuntu 10.04 (64-bit) as a guest OS in VirtualBox, using Windows 7 Professional (64-bit) as the host. After the Ubuntu install, I installed Xfce4 (sudo apt-get install xfce4). I logged in to an Xfce session, and when I logged out I couldn't see the login box anymore, only the regular GNOME background from the login screen. Then I restarted the virtual machine, and now I'm still not able to see the login box, only the GNOME background. Does anyone know how to solve this? Thanks in advance.
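
    A hedged guess: installing xfce4 can leave GDM's greeter/session configuration pointing at something that no longer starts. From a text console in the VM (Ctrl+Alt+F1), re-asserting GDM and restarting it is a cheap thing to try; a sketch:

        sudo dpkg-reconfigure gdm     # re-select GDM as the display manager
        sudo service gdm restart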

  • Error trying to get tmux 1.6 installed - E: Unable to locate package libevent

    - by Michael Durrant
        $ pwd
        /home/durrantm/Downloads/tmux-1.6
        durrantm.../tmux-1.6$ ./configure && make
        checking for a BSD-compatible install... /usr/bin/install -c
        checking whether build environment is sane... yes
        ...
        configure: error: "libevent not found"
        durrantm.../tmux-1.6$ sudo apt-get install libevent
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package libevent
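
    There is no binary package named plain "libevent" on Ubuntu: the runtime library package is versioned, and the headers that tmux's configure script needs are in the -dev package. A sketch of the usual fix:

        sudo apt-get install libevent-dev   # headers + linker symlink for building
        ./configure && make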

  • Can't install libpq-dev on Ubuntu 10.10 with PostgreSQL 9

    - by sheepwalker
    I need some headers from the dev version of PostgreSQL 9, which are contained in libpq-dev, in order to install the pg gem. But when I execute:

        sudo apt-get install libpq-dev

    I get this result:

        The following packages have unmet dependencies:
          libpq-dev : Depends: libpq5 (= 8.4.7-0ubuntu0.10.10) but 9.0.1-1~lucid is to be installed

    When I tried to remove libpq5 (to reinstall it correctly?), it threatened to remove postgresql-9.0:

        The following packages will be REMOVED:
          libpq5 pgadmin3 php5-pgsql postgresql-9.0 postgresql-client-9.0

    Does anybody know how to solve this problem? Thanks.
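
    The ~lucid suffix in the version string suggests PostgreSQL 9.0 came from a third-party repository, while libpq-dev is being resolved from the stock Ubuntu archive, which still targets 8.4. A hedged sketch: check which repository provides which version, then pull libpq-dev from the same source as libpq5 (the pinned version below is copied from the error message and may differ on your machine):

        apt-cache policy libpq5 libpq-dev            # which repo offers which version?
        sudo apt-get install libpq-dev=9.0.1-1~lucid # match the installed libpq5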

  • Problem after changing password in Ubuntu

    - by Narendra
    Hi all, I am using an Ubuntu system which uses OpenLDAP for user login authentication. To change my login password I used "$ sudo passwd" and changed it (of course, the password given is the same as the old one). Since then, when I try to run the "apt" command it shows command not found, and some other programs fail with a segmentation fault. Can anyone tell me why I am facing this issue and how to solve it?
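
    Two things may be conflated here: "sudo passwd" with no username changes the local root password rather than the LDAP entry, and "apt" on its own is not a command on older Ubuntu releases (the tool is apt-get). A few hedged sanity checks before assuming deeper breakage:

        echo $PATH        # is /usr/bin still on the path?
        which apt-get     # apt-get, not "apt", is the usual command name
        # for the LDAP password itself, ldappasswd (in the ldap-utils package)
        # is the usual tool; "sudo passwd" only touched the local account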

  • Snort install issue on debian 6 with libpcre - libpcre library not found

    - by Chuck
    I've read the manual on snort.org for installing Snort on Debian but am still having an issue. Does anyone know how to resolve this? I've tried installing the libpcre3 and libpcre3-dev packages using apt-get, and also installing manually by downloading the latest version from the pcre.org website. Any ideas?

        Checking for pcre-compile in -lpcre... no
        Error! Libpcre library not found.
        Get it from http://www.pcre.org
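
    If the -dev package is present and configure still cannot find the library, pointing Snort's configure script at the locations explicitly sometimes gets past the check; it accepts dedicated flags for this. A hedged sketch (the paths are Debian defaults and may differ on your box):

        sudo apt-get install libpcre3 libpcre3-dev
        ./configure --with-libpcre-includes=/usr/include \
                    --with-libpcre-libraries=/usr/lib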

  • Ubuntu 10.04 server, problems installing the desktop

    - by ILMV
    Hi all, I have just set up two servers running 10.04 Server and have installed ubuntu-desktop as follows:

        sudo apt-get install ubuntu-desktop

    The problem is that even though it says it has installed, it will not auto-start... I've tried this:

        sudo mv /etc/init/gdm.conf /etc/init/gdm.disabled
        sudo mv /etc/init/gdm.disabled /etc/init/gdm.conf

    to disable/enable it, but still no joy. Any ideas? Thanks, Ben
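
    Renaming the upstart job file back and forth does not by itself start the job; upstart only re-evaluates it on the next boot or an explicit start. A hedged sketch of things to try on 10.04:

        sudo start gdm            # or: sudo service gdm start
        initctl list | grep gdm   # is the job even registered with upstart?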

  • PHP cURL POST to log in to WordPress

    - by Sadi
    I followed http://stackoverflow.com/questions/724107 to log in to WordPress using php_curl, and it works fine as long as I use WAMP (Apache/PHP). But when it comes to IIS on the dedicated server, it returns nothing. I wrote the following function, which works fine on my local WAMP, but when deployed to the client's dedicated Windows Server 2003 it doesn't. Please help me.

        function post_url($url, array $query_string)
        {
            //$url = http://myhost.com/wptc/sys/wp/wp-login.php
            /*
            $query_string = array(
                'log' => 'admin',
                'pwd' => 'test',
                'redirect_to' => 'http://google.com',
                'wp-submit' => 'Log%20In',
                'testcookie' => 1
            );
            */
            //temp_dir is defined as folder = path/to/a/folder
            $cookie = temp_dir . "cookie.txt";
            $c = curl_init($url);
            if (count($query_string)) {
                curl_setopt($c, CURLOPT_POSTFIELDS, http_build_query($query_string));
            }
            curl_setopt($c, CURLOPT_POST, 1);
            curl_setopt($c, CURLOPT_COOKIEFILE, $cookie);
            //curl_setopt($c, CURLOPT_SSL_VERIFYPEER, 1);
            //curl_setopt($c, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6");
            curl_setopt($c, CURLOPT_TIMEOUT, 60);
            curl_setopt($c, CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($c, CURLOPT_RETURNTRANSFER, 1); //return the content
            curl_setopt($c, CURLOPT_COOKIEJAR, $cookie);
            //curl_setopt($c, CURLOPT_AUTOREFERER, 1);
            //curl_setopt($c, CURLOPT_REFERER, wp_admin_url);
            //curl_setopt($c, CURLOPT_MAXREDIRS, 10);
            curl_setopt($c, CURLOPT_HEADER, 0);
            //curl_setopt($c, CURLOPT_CRLF, 1);
            try {
                $result = curl_exec($c);
            } catch (Exception $e) {
                $result = 'error';
            }
            curl_close($c);
            return $result; //it returns nothing (empty)
        }

    Other facts: curl_error($c) returns nothing. When CURLOPT_HEADER is set to on, it returns these headers:

        HTTP/1.1 200 OK
        Cache-Control: no-cache, must-revalidate, max-age=0
        Pragma: no-cache
        Content-Type: text/html; charset=UTF-8
        Expires: Wed, 11 Jan 1984 05:00:00 GMT
        Last-Modified: Thu, 06 May 2010 21:06:30 GMT
        Server: Microsoft-IIS/7.0
        X-Powered-By: PHP/5.2.13
        Set-Cookie: wordpress_test_cookie=WP+Cookie+check; path=/wptc/sys/wp/
        Set-Cookie: wordpress_b13661ceb5c3eba8b42d383be885d372=admin%7C1273352790%7C7d8ddfb6b1c0875c37c1805ab98f1e7b; path=/wptc/sys/wp/wp-content/plugins; httponly
        Set-Cookie: wordpress_b13661ceb5c3eba8b42d383be885d372=admin%7C1273352790%7C7d8ddfb6b1c0875c37c1805ab98f1e7b; path=/wptc/sys/wp/wp-admin; httponly
        Set-Cookie: wordpress_logged_in_b13661ceb5c3eba8b42d383be885d372=admin%7C1273352790%7Cb90825fb4a7d5da9b5dc4d99b4e06049; path=/wptc/sys/wp/; httponly
        Refresh: 0;url=http://myhost.com/wptc/sys/wp/wp-admin/
        X-Powered-By: ASP.NET
        Date: Thu, 06 May 2010 21:06:30 GMT
        Content-Length: 0

    cURL version info:

        Array
        (
            [version_number] => 463872
            [age] => 3
            [features] => 2717
            [ssl_version_number] => 0
            [version] => 7.20.0
            [host] => i386-pc-win32
            [ssl_version] => OpenSSL/0.9.8k
            [libz_version] => 1.2.3
            [protocols] => Array
                (
                    [0] => dict
                    [1] => file
                    [2] => ftp
                    [3] => ftps
                    [4] => http
                    [5] => https
                    [6] => imap
                    [7] => imaps
                    [8] => ldap
                    [9] => pop3
                    [10] => pop3s
                    [11] => rtsp
                    [12] => smtp
                    [13] => smtps
                    [14] => telnet
                    [15] => tftp
                )
        )

    Environment: PHP Version 5.2.13, Windows Server 2K3, IIS 7. Working fine on Apache, PHP 3.0 on my localhost (Windows).
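
    A couple of IIS-specific things worth ruling out. In PHP, curl_exec() does not throw exceptions, it returns false on failure, so the try/catch above never fires; and the cookie jar path must be writable by the IIS worker process, which it frequently is not. A hedged debugging sketch (the cookie path below is purely illustrative):

        // curl_exec() returns false on failure -- check it explicitly
        $result = curl_exec($c);
        if ($result === false) {
            error_log('cURL failed: [' . curl_errno($c) . '] ' . curl_error($c));
        }

        // the app-pool identity often cannot write next to the script;
        // confirm the cookie jar directory is writable
        $cookie = 'C:\\inetpub\\temp\\cookie.txt';
        if (!is_writable(dirname($cookie))) {
            error_log('Cookie jar directory is not writable: ' . dirname($cookie));
        }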

  • "tasksel: aptitude failed (100)" installing LAMP on Ubuntu (vagrant)

    - by John Mee
        vagrant@precise64:~$ sudo tasksel install lamp-server
        tasksel: aptitude failed (100)

    I'm hoping to get a LAMP server from a clean install of Ubuntu. All the Googles suggest tasksel, which brings up a pretty ASCII GUI, downloads stuff, then dies with this message. I'm guessing I should try doing it via apt-get on the individual packages, but there's a fair chance somebody else is going to search for this error message and want the right answer.
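
    Exit status 100 is apt/aptitude's generic failure code, so the real error is being hidden by tasksel's UI. A hedged sketch: refresh the package index first, and note that plain apt-get can install a tasksel task directly when the task name ends with a caret:

        sudo apt-get update                  # stale indexes are a common cause of code 100
        sudo apt-get install lamp-server^    # the trailing ^ means "install the task"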

  • cURL - weird HTTPS error

    - by Vincent
    All, I am having trouble requesting info from an HTTPS site using cURL and PHP. I am using Solaris 10. It so happens that sometimes it works and sometimes it doesn't, and I am not sure what the cause is. If it doesn't work, this is the entry recorded in the verbose log:

        * About to connect() to 10.10.101.12 port 443 (#0)
        *   Trying 10.10.101.12... * connected
        * Connected to 10.10.101.12 (10.10.101.12) port 443 (#0)
        * error setting certificate verify locations, continuing anyway:
        *   CAfile: /etc/opt/webstack/curl/curlCA
            CApath: none
        * error:80089077:lib(128):func(137):reason(119)
        * Closing connection #0

    If it works, this is the entry recorded in the verbose log:

        * About to connect() to 10.10.101.12 port 443 (#0)
        *   Trying 10.10.101.12... * connected
        * Connected to 10.10.101.12 (10.10.101.12) port 443 (#0)
        * error setting certificate verify locations, continuing anyway:
        *   CAfile: /etc/opt/webstack/curl/curlCA
            CApath: none
        * SSL connection using DHE-RSA-AES256-SHA
        * Server certificate:
        *   subject: C=CA, ST=British Columnbia, L=Vancouver, O=google, OU=FDN, CN=g.googlenet.com, [email protected]
        *   start date: 2007-07-24 23:06:32 GMT
        *   expire date: 2027-09-07 23:06:32 GMT
        *   issuer: C=US, ST=California, L=Sunnyvale, O=Google, OU=Certificate Authority, CN=support, [email protected]
        *   SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
        > POST /gportal/gpmgr HTTP/1.1^M
        > Host: 10.10.101.12^M
        > Accept: */*^M
        > Accept-Encoding: gzip,deflate^M
        > Content-Length: 1623^M
        > Content-Type: application/x-www-form-urlencoded^M
        > Expect: 100-continue^M
        > ^M
        < HTTP/1.1 100 Continue^M
        < HTTP/1.1 200 OK^M
        < Date: Wed, 28 Apr 2010 21:56:15 GMT^M
        < Server: Apache^M
        < Cache-Control: no-cache^M
        < Pragma: no-cache^M
        < Vary: Accept-Encoding^M
        < Content-Encoding: gzip^M
        < Content-Length: 1453^M
        < Content-Type: application/json^M
        < ^M
        * Connection #0 to host 10.10.101.12 left intact
        * Closing connection #0

    My cURL options are as follows:

        $ch = curl_init();
        $devnull = fopen('/tmp/curlcookie.txt', 'w');
        $fp_err = fopen('/tmp/verbose_file.txt', 'ab+');
        fwrite($fp_err, date('Y-m-d H:i:s')."\n\n");
        curl_setopt($ch, CURLOPT_STDERR, $devnull);
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_URL, $desturl);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
        curl_setopt($ch, CURLOPT_HEADER, false);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 120);
        curl_setopt($ch, CURLOPT_AUTOREFERER, true);
        curl_setopt($ch, CURLOPT_ENCODING, 'gzip,deflate');
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
        curl_setopt($ch, CURLOPT_VERBOSE, 1);
        curl_setopt($ch, CURLOPT_FAILONERROR, true);
        curl_setopt($ch, CURLOPT_STDERR, $fp_err);
        $ret = curl_exec($ch);

    Does anybody have any idea why it works sometimes but mostly fails? Thanks.
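
    The failing runs die exactly where OpenSSL tries to load the CA file (/etc/opt/webstack/curl/curlCA), and since CURLOPT_SSL_VERIFYPEER is already false that bundle is not even needed for the request to proceed. A hedged sketch: point cURL at a CA bundle that definitely exists on the box (the path below is an assumption; substitute whatever bundle your Solaris install actually has), and log failures explicitly:

        // explicit CA bundle so OpenSSL stops tripping over the default path
        curl_setopt($ch, CURLOPT_CAINFO, '/opt/csw/ssl/certs/ca_bundle.pem');

        $ret = curl_exec($ch);
        if ($ret === false) {
            error_log('cURL error: [' . curl_errno($ch) . '] ' . curl_error($ch));
        }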

  • 500 Internal Server Error with PHP application

    - by James
    I have written a PHP application using Windows and XAMPP. I've been trying to run it on Ubuntu 10.10 with lighttpd 1.4.26. Parts of the application work fine, but whenever I try to log in, I get a 500 - Internal Server Error page. The only thing that shows up in /var/log/lighttpd/error.log is:

        2011-02-25 13:43:13: (mod_fastcgi.c.2582) unexpected end-of-file (perhaps the fastcgi process died): pid: 1169 socket: unix:/tmp/php.socket-0
        2011-02-25 13:43:13: (mod_fastcgi.c.3367) response not received, request sent: 1596 on socket: unix:/tmp/php.socket-0 for /~denton/customer-facing-portal/index.php?, closing connection

    If I had any output whatsoever from PHP, this would be a lot easier to debug. Any ideas on how to get some?

    Here is my /etc/lighttpd/lighttpd.conf file:

        # Debian lighttpd configuration file
        #
        ############ Options you really have to take care of ####################

        ## modules to load
        server.modules = (
                "mod_alias",
                "mod_compress",
        #       "mod_rewrite",
        #       "mod_redirect",
        #       "mod_usertrack",
        #       "mod_expire",
        #       "mod_flv_streaming",
        #       "mod_evasive",
                "mod_setenv"
        )

        ## a static document-root, for virtual-hosting take look at the
        ## server.virtual-* options
        server.document-root = "/var/www/"

        ## where to upload files to, purged daily.
        server.upload-dirs = ( "/var/cache/lighttpd/uploads" )

        ## where to send error-messages to
        server.errorlog = "/var/log/lighttpd/error.log"

        ## files to check for if .../ is requested
        index-file.names = ( "index.php", "index.html",
                             "index.htm", "default.htm",
                             "index.lighttpd.html" )

        ## Use the "Content-Type" extended attribute to obtain mime type if possible
        # mimetype.use-xattr = "enable"

        ##
        # which extensions should not be handle via static-file transfer
        #
        # .php, .pl, .fcgi are most often handled by mod_fastcgi or mod_cgi
        static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" )

        ######### Options that are good to be but not neccesary to be changed #######

        ## Use ipv6 only if available. (disabled for while, check #560837)
        #include_shell "/usr/share/lighttpd/use-ipv6.pl"

        ## bind to port (default: 80)
        # server.port = 81

        ## bind to localhost only (default: all interfaces)
        ## server.bind = "localhost"

        ## error-handler for status 404
        #server.error-handler-404 = "/error-handler.html"
        #server.error-handler-404 = "/error-handler.php"

        ## to help the rc.scripts
        server.pid-file = "/var/run/lighttpd.pid"

        ##
        ## Format: <errorfile-prefix><status>.html
        ## -> ..../status-404.html for 'File not found'
        #server.errorfile-prefix = "/var/www/"

        ## virtual directory listings
        dir-listing.encoding = "utf-8"
        server.dir-listing = "enable"

        ### only root can use these options
        #
        # chroot() to directory (default: no chroot() )
        #server.chroot = "/"

        ## change uid to <uid> (default: don't change)
        server.username = "www-data"

        ## change gid to <gid> (default: don't change)
        server.groupname = "www-data"

        #### compress module
        compress.cache-dir = "/var/cache/lighttpd/compress/"
        compress.filetype = ("text/plain", "text/html", "application/x-javascript", "text/css")

        #### url handling modules (rewrite, redirect, access)
        # url.rewrite = ( "^/$" => "/server-status" )
        # url.redirect = ( "^/wishlist/(.+)" => "http://www.123.org/$1" )

        #### expire module
        # expire.url = ( "/buggy/" => "access 2 hours", "/asdhas/" => "access plus 1 seconds 2 minutes")

        #### external configuration files

        ## mimetype mapping
        include_shell "/usr/share/lighttpd/create-mime.assign.pl"

        ## load enabled configuration files,
        ## read /etc/lighttpd/conf-available/README first
        include_shell "/usr/share/lighttpd/include-conf-enabled.pl"

        ## Set environment variables
        setenv.add-environment = (
                "DB_URL__DEMO" => "192.168.1.231",
                "DB_NAME_DEMO" => "demo",
                "DB_USER_DEMO" => "user",
                "DB_PASS_DEMO" => "password",
                "DB_AGENCY_DEMO" => "demo"
        )

    Here is my /etc/php5/cgi/php.ini file (sans 1641 lines of comments):

        [PHP]
        register_long_arrays = Off
        short_open_tag = Off
        engine = On
        short_open_tag = Off
        asp_tags = Off
        precision = 14
        y2k_compliance = On
        output_buffering = 4096
        zlib.output_compression = Off
        implicit_flush = Off
        unserialize_callback_func =
        serialize_precision = 100
        allow_call_time_pass_reference = Off
        safe_mode = Off
        safe_mode_gid = Off
        safe_mode_include_dir =
        safe_mode_exec_dir =
        safe_mode_allowed_env_vars = PHP_
        safe_mode_protected_env_vars = LD_LIBRARY_PATH
        disable_functions =
        disable_classes =
        expose_php = On
        max_execution_time = 30
        max_input_time = 60
        memory_limit = 128M
        error_reporting = E_ALL & ~E_DEPRECATED & ~E_STRICT
        display_errors = On
        display_startup_errors = On
        log_errors = On
        log_errors_max_len = 1024
        ignore_repeated_errors = Off
        ignore_repeated_source = Off
        report_memleaks = On
        track_errors = On
        html_errors = On
        variables_order = "GPCS"
        request_order = "GP"
        register_globals = Off
        register_long_arrays = Off
        register_argc_argv = Off
        auto_globals_jit = On
        post_max_size = 8M
        magic_quotes_gpc = Off
        magic_quotes_runtime = Off
        magic_quotes_sybase = Off
        auto_prepend_file =
        auto_append_file =
        default_mimetype = "text/html"
        doc_root =
        user_dir =
        enable_dl = Off
        cgi.fix_pathinfo=1
        file_uploads = On
        upload_max_filesize = 2M
        max_file_uploads = 20
        allow_url_fopen = On
        allow_url_include = Off
        default_socket_timeout = 60

        [Date]
        date.timezone = "America/Chicago"

        [filter]
        [iconv]
        [intl]
        [sqlite]
        [sqlite3]
        [Pcre]
        [Pdo]

        [Pdo_mysql]
        pdo_mysql.cache_size = 2000
        pdo_mysql.default_socket=

        [Phar]

        [Syslog]
        define_syslog_variables = Off

        [mail function]
        SMTP = localhost
        smtp_port = 25
        mail.add_x_header = On

        [SQL]
        sql.safe_mode = Off

        [ODBC]
        odbc.allow_persistent = On
        odbc.check_persistent = On
        odbc.max_persistent = -1
        odbc.max_links = -1
        odbc.defaultlrl = 4096
        odbc.defaultbinmode = 1

        [Interbase]
        ibase.allow_persistent = 1
        ibase.max_persistent = -1
        ibase.max_links = -1
        ibase.timestampformat = "%Y-%m-%d %H:%M:%S"
        ibase.dateformat = "%Y-%m-%d"
        ibase.timeformat = "%H:%M:%S"

        [MySQL]
        mysql.allow_local_infile = On
        mysql.allow_persistent = On
        mysql.cache_size = 2000
        mysql.max_persistent = -1
        mysql.max_links = -1
        mysql.default_port =
        mysql.default_socket =
        mysql.default_host =
        mysql.default_user =
        mysql.default_password =
        mysql.connect_timeout = 60
        mysql.trace_mode = Off

        [MySQLi]
        mysqli.max_persistent = -1
        mysqli.allow_persistent = On
        mysqli.max_links = -1
        mysqli.cache_size = 2000
        mysqli.default_port = 3306
        mysqli.default_socket =
        mysqli.default_host =
        mysqli.default_user =
        mysqli.default_pw =
        mysqli.reconnect = Off

        [mysqlnd]
        mysqlnd.collect_statistics = On
        mysqlnd.collect_memory_statistics = Off

        [OCI8]

        [PostgresSQL]
        pgsql.allow_persistent = On
        pgsql.auto_reset_persistent = Off
        pgsql.max_persistent = -1
        pgsql.max_links = -1
        pgsql.ignore_notice = 0
        pgsql.log_notice = 0

        [Sybase-CT]
        sybct.allow_persistent = On
        sybct.max_persistent = -1
        sybct.max_links = -1
        sybct.min_server_severity = 10
        sybct.min_client_severity = 10

        [bcmath]
        bcmath.scale = 0

        [browscap]

        [Session]
        session.save_handler = files
        session.use_cookies = 1
        session.use_only_cookies = 1
        session.name = PHPSESSID
        session.auto_start = 0
        session.cookie_lifetime = 0
        session.cookie_path = /
        session.cookie_domain =
        session.cookie_httponly =
        session.serialize_handler = php
        session.gc_probability = 1
        session.gc_divisor = 1000
        session.gc_maxlifetime = 1440
        session.bug_compat_42 = Off
        session.bug_compat_warn = Off
        session.referer_check =
        session.entropy_length = 0
        session.cache_limiter = nocache
        session.cache_expire = 180
        session.use_trans_sid = 0
        session.hash_function = 0
        session.hash_bits_per_character = 5
        url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=fakeentry"

        [MSSQL]
        mssql.allow_persistent = On
        mssql.max_persistent = -1
        mssql.max_links = -1
        mssql.min_error_severity = 10
        mssql.min_message_severity = 10
        mssql.compatability_mode = Off
        mssql.secure_connection = Off

        [Assertion]
        [COM]
        [mbstring]
        [gd]
        [exif]

        [Tidy]
        tidy.clean_output = Off

        [soap]
        soap.wsdl_cache_enabled=1
        soap.wsdl_cache_dir="/tmp"
        soap.wsdl_cache_ttl=86400
        soap.wsdl_cache_limit = 5

        [sysvshm]

        [ldap]
        ldap.max_links = -1

        [mcrypt]
        [dba]

    Update: here is /etc/lighttpd/conf-enabled/15-fastcgi-php.conf. As far as I know, it's just the default config file the Ubuntu package installed:

        ## FastCGI programs have the same functionality as CGI programs,
        ## but are considerably faster through lower interpreter startup
        ## time and socketed communication
        ##
        ## Documentation: /usr/share/doc/lighttpd-doc/fastcgi.txt.gz
        ##                http://redmine.lighttpd.net/projects/lighttpd/wiki/Docs:ConfigurationOptions#mod_fastcgi-fastcgi

        ## Start an FastCGI server for php (needs the php5-cgi package)
        fastcgi.server += ( ".php" =>
                ((
                        "bin-path" => "/usr/bin/php-cgi",
                        "socket" => "/tmp/php.socket",
                        "max-procs" => 1,
                        "idle-timeout" => 20,
                        "bin-environment" => (
                                "PHP_FCGI_CHILDREN" => "4",
                                "PHP_FCGI_MAX_REQUESTS" => "10000"
                        ),
                        "bin-copy-environment" => (
                                "PATH", "SHELL", "USER"
                        ),
                        "broken-scriptfilename" => "enable"
                ))
        )
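
    Since the log says the FastCGI process itself died, PHP's own log is the place to get output. The php.ini above has log_errors = On but no error_log destination, so messages go to the SAPI default and effectively vanish. A hedged sketch (file path is illustrative):

        ; /etc/php5/cgi/php.ini -- give errors a concrete destination
        log_errors = On
        error_log = /var/log/php_errors.log

    Then create that file writable by www-data and watch it (tail -f /var/log/php_errors.log) while reproducing the 500. Running the failing script directly under the same binary, e.g. sudo -u www-data /usr/bin/php-cgi /path/to/index.php, can also reveal a crash that lighttpd only reports as an unexpected end-of-file.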

  • Can't upgrade my Ubuntu server, it gets stuck on openjdk-6-jre-headless

    - by Jean-Nicolas Boulay Desjardins
    I am using Ubuntu Server. When I do:

        apt-get upgrade

    it gets stuck on:

        Setting up openjdk-6-jre-headless (6b20-1.9.7-0ubuntu1) ...

    Why? And what can I do to stop it? I tried removing it with apt-get... I get this error:

        E: dpkg was interrupted, you must manually run 'sudo dpkg --configure -a' to correct the problem.

    So then I tried this:

        dpkg --purge openjdk-6-jre-headless

    I got this:

        dpkg: dependency problems prevent removal of openjdk-6-jre-headless:
         openjdk-6-jre-lib depends on openjdk-6-jre-headless (>= 6b17).
         ca-certificates-java depends on openjdk-6-jre-headless (>= 6b16-1.6.1-2) | java6-runtime-headless; however:
          Package openjdk-6-jre-headless is to be removed.
          Package java6-runtime-headless is not installed.
          Package openjdk-6-jre-headless which provides java6-runtime-headless is to be removed.
         ca-certificates-java depends on openjdk-6-jre-headless (>= 6b16-1.6.1-2) | java6-runtime-headless; however:
          Package openjdk-6-jre-headless is to be removed.
          Package java6-runtime-headless is not installed.
          Package openjdk-6-jre-headless which provides java6-runtime-headless is to be removed.
        dpkg: error processing openjdk-6-jre-headless (--purge):
         dependency problems - not removing
        Errors were encountered while processing:
         openjdk-6-jre-headless

    The thing is, I think my DB is using it... Not sure... I am using Cassandra with Thrift... Yes, it's getting a bit more complex...

    After:

        dpkg --configure -a

    I get:

        dpkg: dependency problems prevent configuration of openjdk-6-jre:
         openjdk-6-jre depends on openjdk-6-jre-headless (>= 6b20-1.9.7-0ubuntu1); however:
          Package openjdk-6-jre-headless is not configured yet.
        dpkg: error processing openjdk-6-jre (--configure):
         dependency problems - leaving unconfigured
        Processing triggers for libc-bin ...
        ldconfig deferred processing now taking place
        dpkg: dependency problems prevent configuration of libaccess-bridge-java:
         libaccess-bridge-java depends on default-jre | openjdk-6-jre | sun-java6-jre; however:
          Package default-jre is not installed.
          Package openjdk-6-jre is not configured yet.
          Package sun-java6-jre is not installed.
        dpkg: error processing libaccess-bridge-java (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of icedtea-6-jre-cacao:
         icedtea-6-jre-cacao depends on openjdk-6-jre-headless (= 6b20-1.9.7-0ubuntu1); however:
          Package openjdk-6-jre-headless is not configured yet.
        dpkg: error processing icedtea-6-jre-cacao (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libaccess-bridge-java-jni:
         libaccess-bridge-java-jni depends on libaccess-bridge-java (>= 1.26.2-5); however:
          Package libaccess-bridge-java is not configured yet.
        dpkg: error processing libaccess-bridge-java-jni (--configure):
         dependency problems - leaving unconfigured
        Errors were encountered while processing:
         openjdk-6-jre
         libaccess-bridge-java
         icedtea-6-jre-cacao
         libaccess-bridge-java-jni

    Thanks again for any help.
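
    When the openjdk postinst hangs like this, a frequent culprit is ca-certificates-java rebuilding the Java certificate store (it can stall on a bad certificate, or on low entropy in a headless VM). A hedged sequence to try:

        sudo dpkg --configure -a       # let dpkg finish the interrupted setup
        sudo apt-get install -f        # repair any remaining dependency state
        # if it wedges again, see what the stuck maintainer script is running:
        ps aux | grep -E 'dpkg|java|keytool'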

  • mysql: job failed to start; mysqld.sock is missing

    - by Freefri
    How can I fix this and start mysql-server? After /etc/init.d/mysql start or service mysql start I get the message:

        start: Job failed to start

    And after running mysqld I get this:

        mysqld
        121123 11:33:33 [ERROR] Can't find messagefile '/usr/share/mysql/errmsg.sys'
        121123 11:33:33 [Note] Plugin 'FEDERATED' is disabled.
        mysqld: Unknown error 1146
        121123 11:33:33 [ERROR] Can't open the mysql.plugin table. Please run mysql_upgrade to create it.
        121123 11:33:33 InnoDB: The InnoDB memory heap is disabled
        121123 11:33:33 InnoDB: Mutexes and rw_locks use GCC atomic builtins
        121123 11:33:33 InnoDB: Compressed tables use zlib 1.2.3.4
        121123 11:33:33 InnoDB: Initializing buffer pool, size = 128.0M
        121123 11:33:33 InnoDB: Completed initialization of buffer pool
        121123 11:33:33 InnoDB: highest supported file format is Barracuda.
        121123 11:33:33 InnoDB: Waiting for the background threads to start
        121123 11:33:34 InnoDB: 1.1.8 started; log sequence number 1595675
        121123 11:33:34 [ERROR] Aborting
        121123 11:33:34 InnoDB: Starting shutdown...
        121123 11:33:35 InnoDB: Shutdown completed; log sequence number 1595675
        121123 11:33:35 [Note]

    I tried to do what MySQL told me to do:

        mysql_upgrade
        Looking for 'mysql' as: mysql
        Looking for 'mysqlcheck' as: mysqlcheck
        Running 'mysqlcheck' with connection arguments: '--port=3306' '--socket=/var/run/mysqld/mysqld.sock'
        mysqlcheck: Got error: 2002: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2) when trying to connect
        FATAL ERROR: Upgrade failed

    And yes, /var/run/mysqld is empty (running mysql_upgrade again gives identical output).

    This is my /etc/mysql/my.cnf file:

        # cat /etc/mysql/my.cnf | grep sock
        # Remember to edit /etc/mysql/debian.cnf when changing the socket location.
        socket = /var/run/mysqld/mysqld.sock
        socket = /var/run/mysqld/mysqld.sock
        socket = /var/run/mysqld/mysqld.sock

    Then I tried to reinstall MySQL from scratch:

        apt-get purge mysql-client mysql-common mysql-server
        rm -R /var/lib/mysql
        rm -R /etc/mysql
        rm -R /var/run/mysqld
        userdel mysql
        apt-get install mysql-server mysql-client

    Then, after typing my root password for MySQL, I get this error:

        Unable to set password for the MySQL "root" user

        An error occurred while setting the password for the MySQL administrative
        user. This may have happened because the account already has a password, or
        because of a communication problem with the MySQL server.

        You should check the account's password after the package installation.

        Please read the /usr/share/doc/mysql-server-5.5/README.Debian file for more
        information.

    And again I can't start MySQL, getting the same messages.
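
    The first [ERROR] line is the real lead: errmsg.sys ships with the server-core package, and when it is missing mysqld aborts before it ever creates the socket, which is why mysql_upgrade cannot connect either. A hedged sketch, assuming the 5.5 packages named in the error box:

        dpkg -S errmsg.sys    # which installed package should own the message file?
        sudo apt-get install --reinstall mysql-server-core-5.5 mysql-server-5.5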

  • Compiling PHP with LDAP support on Ubuntu 12.10

    - by Andrew Ellis
    I am trying to compile PHP on Ubuntu 12.10 with LDAP support. I have run:

        apt-get install libldap2-dev

    That installs the header files to /usr/include. However, when attempting to compile, configure is unable to locate the header files. I have tried --with-ldap=/usr/include as well, and it still fails with:

        configure: error: Cannot find ldap.h

    I also tried symlinking with the following, and I still get the same error:

        ln -s /usr/lib/ldap* /usr/lib/

    Thanks in advance for your help.
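
    Two details are worth checking. First, --with-ldap expects an installation prefix (a directory containing include/ and lib/), so the value to pass is /usr rather than /usr/include. Second, on 12.10 the OpenLDAP libraries live in the multiarch directory /usr/lib/x86_64-linux-gnu, which older configure scripts do not search. A hedged sketch (adjust the multiarch triplet for 32-bit systems):

        # expose the multiarch libs where configure looks, then pass the prefix
        sudo ln -s /usr/lib/x86_64-linux-gnu/libldap* /usr/lib/
        sudo ln -s /usr/lib/x86_64-linux-gnu/liblber* /usr/lib/
        ./configure --with-ldap=/usr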

  • Subversion - Retrieval of mergeinfo unsupported

    - by jamesthomson
    Hi, I've recently updated my Subversion package on Debian Etch to 1.5.1 via a backport. I've gone through what I believe are all the appropriate steps, but cannot for the life of me get past the following error message when I try to merge:

        Retrieval of mergeinfo unsupported by '.'

    The '.' isn't important, as I get the same message whether I'm SSH'd onto the server or using TortoiseSVN through Windows. I'll take you through what I did to upgrade and test, step by step.

    Update of Subversion. Added the following line to /etc/apt/sources.list:

        deb http://www.backports.org/debian etch-backports main contrib non-free

    and then ran:

        apt-get -s -t etch-backports install subversion

    Checked the version of the Subversion installation. Done by running svnadmin --version, which gave the following output:

        svnadmin, version 1.5.1 (r32289)
        compiled Dec 11 2008, 18:10:14

    Checked the client too, using svn --version, and got the following:

        svn, version 1.5.1 (r32289)
        compiled Dec 11 2008, 18:10:14

    OK, so all looking good so far. Now I just need to upgrade the repository. After plenty of research, the most foolproof way to do this seemed to be to dump the repository and then load it again. So here's what I did:

        svnadmin dump /var/svn/repo > repo.dump
        rm -aR /var/svn/repo/*
        svnadmin create /var/svn/repo
        svnadmin load < repo.dump

    All that seemed to work fine. I then checked whether the repository had been upgraded by looking at the contents of /var/svn/repo/db/format, which gave:

        3
        layout sharded 1000

    Again this indicated a Subversion 1.5 repository, so all looking good. Now I try to do a merge using the Subversion client in Debian:

        svn mergeinfo https://mysvn/repo .

    and I get the following error:

        svn: Retrieval of mergeinfo unsupported by '.'

    I get the same error message whether I'm using the Debian shell on the same server or connecting via TortoiseSVN from a Windows box. If I browse to the repository using my web browser, the version number at the bottom reads: Powered by Subversion version 1.4.2 (r22196). In case it helps, the created date on mod_dav_svn.so is 2009-08-06 18:29.

    I just cannot figure out why I'm getting this message, so any help pointing me in the right direction would be greatly appreciated. All the forum and mailing list posts I found relating to this error were solved by doing an svnadmin upgrade, though I have actually tried that and still no joy. Thanks in advance, James.
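
    The "Powered by Subversion version 1.4.2" footer is the giveaway: the command-line tools were upgraded, but Apache is still loading the old 1.4 mod_dav_svn, and a 1.4 server cannot answer mergeinfo requests regardless of the client or repository version. A hedged sketch of the fix, assuming backports carries the matching module package:

        sudo apt-get -t etch-backports install libapache2-svn
        sudo /etc/init.d/apache2 restart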

  • Using UUIDs for cheap equals() and hashCode()

    - by Tom McIntyre
    I have an immutable class, TokenList, which consists of a list of Token objects, which are also immutable:

        @Immutable
        public final class TokenList {
            private final List<Token> tokens;

            public TokenList(List<Token> tokens) {
                this.tokens = Collections.unmodifiableList(new ArrayList<Token>(tokens));
            }

            public List<Token> getTokens() {
                return tokens;
            }
        }

    I do several operations on these TokenLists that take multiple TokenLists as inputs and return a single TokenList as the output. There can be arbitrarily many TokenLists going in, and each can have arbitrarily many Tokens. These operations are expensive, and there is a good chance that the same operation (i.e. the same inputs) will be performed multiple times, so I would like to cache the outputs. However, performance is critical, and I am worried about the expense of performing hashCode() and equals() on these objects that may contain arbitrarily many elements (as they are immutable, hashCode could be cached, but equals will still be expensive). This led me to wondering whether I could use a UUID to provide equals() and hashCode() simply and cheaply, by making the following updates to TokenList:

        @Immutable
        public final class TokenList {
            private final List<Token> tokens;
            private final UUID uuid;

            public TokenList(List<Token> tokens) {
                this.tokens = Collections.unmodifiableList(new ArrayList<Token>(tokens));
                this.uuid = UUID.randomUUID();
            }

            public List<Token> getTokens() {
                return tokens;
            }

            public UUID getUuid() {
                return uuid;
            }
        }

    And something like this to act as a cache key:

        @Immutable
        public final class TokenListCacheKey {
            private final UUID[] uuids;

            public TokenListCacheKey(TokenList... tokenLists) {
                uuids = new UUID[tokenLists.length];
                for (int i = 0; i < uuids.length; i++) {
                    uuids[i] = tokenLists[i].getUuid();
                }
            }

            @Override
            public int hashCode() {
                return Arrays.hashCode(uuids);
            }

            @Override
            public boolean equals(Object other) {
                if (other == this) return true;
                if (other instanceof TokenListCacheKey)
                    return Arrays.equals(uuids, ((TokenListCacheKey) other).uuids);
                return false;
            }
        }

    I figure that there are 2^128 different UUIDs and I will probably have at most around 1,000,000 TokenList objects active in the application at any time. Given this, and the fact that the UUIDs are used combinatorially in cache keys, it seems that the chance of this producing the wrong result is vanishingly small. Nevertheless, I feel uneasy about going ahead with it, as it just feels 'dirty'. Are there any reasons I should not use this system? Will the performance cost of the SecureRandom used by UUID.randomUUID() outweigh the gains (especially since I expect multiple threads to be doing this at the same time)? Are collisions going to be more likely than I think? Basically, is there anything wrong with doing it this way? Thanks.
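
    One subtlety worth spelling out: with random UUIDs, two TokenLists built from identical tokens get different identities, so the cache only hits when the exact same instances recur. If content-based equality is wanted instead, a hedged alternative sketch is to keep a real equals() but precompute the hash at construction time; the constructor is O(n) either way, since the list is already being copied:

        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;

        public final class TokenList {
            private final List<Token> tokens;
            private final int cachedHash;   // safe to cache: the list and Tokens are immutable

            public TokenList(List<Token> tokens) {
                this.tokens = Collections.unmodifiableList(new ArrayList<Token>(tokens));
                this.cachedHash = this.tokens.hashCode();   // O(n) once, O(1) thereafter
            }

            public List<Token> getTokens() { return tokens; }

            @Override public int hashCode() { return cachedHash; }

            @Override public boolean equals(Object other) {
                if (other == this) return true;
                if (!(other instanceof TokenList)) return false;
                TokenList that = (TokenList) other;
                // cheap reject first; the full O(n) comparison runs only on a hash match
                return this.cachedHash == that.cachedHash && this.tokens.equals(that.tokens);
            }
        }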

  • How to install clonezilla on Ubuntu 9.10?

    - by Fantomas
    I installed the latest Ubuntu on a flash drive and was ready to install Clonezilla with:

        apt-get install clonezilla

    but that doesn't seem to be the way to do it? I prefer a physical install to a live one, because then I have a chance to load specialized NTFS drivers, for instance, should there be a need. How can I do this? Thank you!
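
    Clonezilla does not appear to be in the stock Ubuntu 9.10 archive (it is distributed by the DRBL project), so plain apt-get will only find it once a repository carrying it is enabled. A couple of hedged first steps to confirm what apt can actually see:

        apt-cache search clonezilla   # does any enabled repository provide it?
        apt-cache policy clonezilla   # if so, from where and at what version?

    If nothing turns up, the DRBL project's repository and .deb downloads are the usual source for an installed (rather than live) Clonezilla.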

  • How to unblock outgoing HTTP and HTTPS traffic in iptables?

    - by EApubs
    With the following iptables rules, I am unable to do an apt update or ping a website. What's wrong with the rules? How can I fix it? What is the exact rule to fix it?

        Chain INPUT (policy ACCEPT)
        target     prot opt source       destination
        ACCEPT     tcp  --  anywhere     anywhere       tcp dpt:325
        DROP       all  --  anywhere     anywhere

        Chain FORWARD (policy DROP)
        target     prot opt source       destination

        Chain OUTPUT (policy ACCEPT)
        target     prot opt source       destination
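
    Outgoing packets are fine (the OUTPUT policy is ACCEPT), but the catch-all DROP at the end of INPUT discards every reply packet, so apt's TCP handshakes and ping's echo replies never make it back in. The usual fix is a connection-tracking rule inserted above the DROP; a sketch:

        # accept replies to connections this host initiated
        sudo iptables -I INPUT 1 -m state --state ESTABLISHED,RELATED -j ACCEPT
        # loopback traffic is commonly allowed too
        sudo iptables -I INPUT 1 -i lo -j ACCEPT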

  • Redmine installation on Ubuntu ... now what?

    - by Jamie
    I've been tinkering/testing with the Ubuntu Server 10.04 Beta LAMP stack in a VM, and now I've come to the Redmine install. I found a package for it and issued:

        sudo tasksel install lamp-server
        sudo apt-get install redmine

    which (I think) almost worked, but I've no idea how to test it, or even know whether it's configured. How do I test it? I'm using 10.04 Server, so I don't have a local GUI.
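
    The Debian/Ubuntu package puts Redmine under /usr/share/redmine but does not wire it into a web server by itself. A hedged smoke test is to run the bundled WEBrick server and browse to it from the host machine (the path, port, and default admin/admin login below are the conventional package defaults and may differ):

        cd /usr/share/redmine
        sudo ruby script/server webrick -e production -p 3000
        # then visit http://<vm-ip>:3000/ from the host's browser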

  • How to reduce Entity Framework 4 query compile time?

    - by Rup
    Summary: We're having problems with EF4 query compilation times of 12+ seconds. Cached queries will only get us so far; are there any ways we can actually reduce the compilation time? Is there anything we might be doing wrong we can look for? Thanks!

    We have an EF4 model which is exposed over WCF services. For each of our entity types we expose a method to fetch and return the whole entity for display / edit, including a number of referenced child objects. For one particular entity we have to .Include() 31 tables / sub-tables to return all relevant data. Unfortunately this makes the EF query compilation prohibitively slow: it takes 12-15 seconds to compile and builds a 7,800-line, 300K query. This is the back-end of a web UI which will need to be snappier than that. Is there anything we can do to improve this?

    We can CompiledQuery.Compile this - that doesn't do any work until first use and so helps the second and subsequent executions, but our customer is nervous that the first usage shouldn't be slow either. Similarly, if the IIS app pool hosting the web service gets recycled we'll lose the cached plan, although we can increase lifetimes to minimise this. Also, I can't see a way to precompile this ahead of time and / or to serialise out the EF compiled query cache (short of reflection tricks). The CompiledQuery object only contains a GUID reference into the cache, so it's the cache we really care about. (Writing this out, it occurs to me I can kick off something in the background from app_startup to execute all queries to get them compiled - is that safe?)

    However, even if we do solve that problem, we build up our search queries dynamically with LINQ-to-Entities clauses based on which parameters we're searching on: I don't think the SQL generator does a good enough job that we can move all that logic into the SQL layer, so I don't think we can pre-compile our search queries. This is less serious because the search results use fewer tables, and so it's only a 3-4 second compile rather than 12-15, but the customer thinks that still won't really be acceptable to end-users. So we really need to reduce the query compilation time somehow. Any ideas?

    Profiling points to ELinqQueryState.GetExecutionPlan as the place to start, and I have attempted to step into that, but without the real .NET 4 source available I couldn't get very far, and the source generated by Reflector won't let me step into some functions or set breakpoints in them. The project was upgraded from .NET 3.5, so I have tried regenerating the EDMX from scratch in EF4 in case there was something wrong with it, but that didn't help. I have tried the EFProf utility advertised here, but it doesn't look like it would help with this; my large query crashes its data collector anyway. I have run the generated query through SQL performance tuning and it already has 100% index usage; I can't see anything wrong with the database that would cause the query generator problems. Is there something O(n^2) in the execution plan compiler - is breaking this down into blocks of separate data loads, rather than all 32 tables at once, likely to help? Setting EF to lazy-load didn't help. I've bought the pre-release O'Reilly Julie Lerman EF4 book, but I can't find anything in there to help beyond 'compile your queries'.

    I don't understand why it's taking 12-15 seconds to generate a single select across 32 tables, so I'm optimistic there's some scope for improvement! We're running against SQL Server 2008 in case that matters, with XP / 7 / Server 2008 R2 clients, using RTM VS2010. Thanks for any suggestions!
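
    On the app_startup idea floated above, a hedged sketch of what that could look like: compile the hot query once per AppDomain and warm it on a background thread so the first real user does not pay the 12-15 seconds. Entity and context names here are hypothetical stand-ins for the real model:

        using System;
        using System.Data.Objects;
        using System.Linq;
        using System.Threading;

        public static class QueryWarmup
        {
            // Compiled once per AppDomain; EF caches the plan behind a GUID key.
            public static readonly Func<MyEntities, int, IQueryable<Customer>> CustomerGraph =
                CompiledQuery.Compile((MyEntities ctx, int id) =>
                    from c in ctx.Customers.Include("Orders")   // chain the other 30 Includes here
                    where c.Id == id
                    select c);

            // Call from Application_Start; uses its own context, so it does not
            // interfere with real requests.
            public static void Start()
            {
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    using (MyEntities ctx = new MyEntities())
                    {
                        // executing once forces plan compilation; -1 returns no rows
                        CustomerGraph(ctx, -1).ToList();
                    }
                });
            }
        }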
