Search Results

Search found 3721 results on 149 pages for 'postgresql newbie'.

  • How to bind a double precision using psycopg2

    - by user337636
    I'm trying to bind a Python float to a PostgreSQL double precision column using psycopg2:

        ele = 1.0 / 3.0
        dic = {'name': 'test', 'ele': ele}
        sql = '''insert into waypoints (name, elevation) values (%(name)s, %(ele)s)'''
        cur = db.cursor()
        cur.execute(sql, dic)
        db.commit()

        sql = """select elevation from waypoints where name = 'test'"""
        cur.execute(sql)
        ele_out = cur.fetchone()[0]

        ele_out
        0.33333333333300003
        ele
        0.33333333333333331

    Obviously I don't need the precision, but I would like to be able to simply compare the values. I could use the struct module and save the value as a string, but I thought there should be a better way. Thanks
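
    A minimal sketch of one way to handle the comparison, assuming the waypoints table from the question already exists; the connection string and tolerance are illustrative. It stores the float as usual and compares the round-tripped value against the original with an explicit tolerance instead of ==:

        import psycopg2

        # Illustrative connection string; the waypoints table is assumed to exist.
        db = psycopg2.connect("dbname=test")
        cur = db.cursor()

        ele = 1.0 / 3.0
        cur.execute(
            "insert into waypoints (name, elevation) values (%(name)s, %(ele)s)",
            {'name': 'test', 'ele': ele},
        )
        db.commit()

        cur.execute("select elevation from waypoints where name = 'test'")
        ele_out = cur.fetchone()[0]

        # The text representation used for the bind parameter may drop trailing
        # digits, so compare with a tolerance instead of ==.
        print(abs(ele_out - ele) < 1e-9)  # True even if the last digits differ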

  • Database schema for simple stats project

    - by Bubnoff
    Backdrop: I have a file hierarchy of CSV files for multiple locations, in folders named by the dates they cover, one folder per month. Each CSV file in a folder is named after the location. E.g., folder 2010-feb contains:

        location1.csv
        location2.csv

    Each CSV file holds records like this:

        2010-06-28, 20:30:00, 0
        2010-06-29, 08:30:00, 0
        2010-06-29, 09:30:00, 0
        2010-06-29, 10:30:00, 0
        2010-06-29, 11:30:00, 0

    The record columns are: date, time, # of sessions.

    I have a Perl script that pulls the data from this mess. Originally I was going to store it as JSON files, but I am thinking a database might be more appropriate long term: comparing year-to-year trends, fun stuff like that.

    Pt 2 - My question/problem: I now have a REST service that coughs up JSON from a test database. My question is (I suck at DB design): how best to design a database backend for this? I am thinking the following tables would suffice and keep it simple:

        Location: (PK) location_code, name
        session:  (PK) id, (FK) location_code, month, hour, num_sessions

    I need to be able to average sessions (plus min and max) for each hour across days of the week, in addition to days of the week in a given month or months. I've been using Perl hashes to do this and am trying to decide how best to implement it with a database. Do you think stored procedures should be used? As to the database, depending on the info gathered here, it will be PostgreSQL or SQLite. If there is no compelling reason for PostgreSQL I'll stick with SQLite.

    How and where should I compare the data to hours of operation? I am storing the hours of operation in a YAML file, and I currently match the hour in the data against a hash built from the YAML. Would a database open up simpler methods? I am thinking I would do this comparison as I do now, then insert the data. It can be recalled with:

        SELECT hour, num_sessions FROM session WHERE location_code='LOC1'

    Since only hours of operation are present, I do not need to worry about it. Should I calculate all results as I do now and then store them in a stats table for the different reports, rather than processing on demand? How would this look? Anyway ...I ramble. Thanks for reading! Bubnoff
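
    A minimal sketch of the proposed two-table layout in SQLite from Python, with an aggregate query for per-hour stats; the table and column names follow the question, while the database file name and sample location code are illustrative:

        import sqlite3

        con = sqlite3.connect("stats.db")  # illustrative file name
        cur = con.cursor()

        cur.executescript("""
            CREATE TABLE IF NOT EXISTS location (
                location_code TEXT PRIMARY KEY,
                name          TEXT
            );
            CREATE TABLE IF NOT EXISTS session (
                id            INTEGER PRIMARY KEY,
                location_code TEXT REFERENCES location(location_code),
                month         TEXT,
                hour          INTEGER,
                num_sessions  INTEGER
            );
        """)

        # Average, min and max sessions per hour for one location.
        cur.execute("""
            SELECT hour, AVG(num_sessions), MIN(num_sessions), MAX(num_sessions)
            FROM session
            WHERE location_code = ?
            GROUP BY hour
            ORDER BY hour
        """, ("LOC1",))
        for row in cur.fetchall():
            print(row)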

  • CNAME - how will the URL be in the HTTP request

    - by Traveller
    A newbie question regarding DNS records. Let's say I've configured:

        abc.example.com - A     10.x.x.x
        xyz.example.com - CNAME abc.example.com

    When a user makes an HTTP request for xyz.example.com, what happens when the request reaches the 10.x.x.x server? Will the URL be abc.example.com or xyz.example.com? (I'm trying to find out whether the virtual host in Apache needs to be updated.) Thanks much

  • Notepad++: Merge 2 lines into 1 line

    - by Kalman Mettler
    Sorry for my rough English, I will try to visualize my question. I have two lists of words, each list in a separate file:

    File 1:

        white fehér
        green zöld
        red piros

    File 2:

        white blanco
        green verde
        red roja

    I need to combine these lists, removing any duplicates, and create a new file containing the following:

        fehér blanco
        zöld verde
        piros roja

    I am a newbie with Notepad++ and can't work out this problem.
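
    The question is about Notepad++, but a short Python sketch shows the same merge, assuming each file holds one English word plus its translation per line, separated by a space; the file names are illustrative:

        def read_pairs(path):
            """Read 'key translation' lines into a list of (key, translation)."""
            pairs = []
            with open(path, encoding="utf-8") as f:
                for line in f:
                    parts = line.split()
                    if len(parts) == 2:
                        pairs.append((parts[0], parts[1]))
            return pairs

        # Group translations by the shared first word, dropping the duplicate key.
        merged = {}
        for key, translation in read_pairs("file1.txt") + read_pairs("file2.txt"):
            merged.setdefault(key, []).append(translation)

        with open("merged.txt", "w", encoding="utf-8") as out:
            for translations in merged.values():
                out.write(" ".join(translations) + "\n")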

  • Creating an application using Rails 2.3.5 and a Cassandra database

    - by Joshi
    Hi all, please guide me on how to create a Rails application using Rails 2.3.5 with a Cassandra database, since Rails 2.3.5 only supports MySQL, SQLite, etc. I typed this at the command prompt:

        $ rails -d cassandra myapp
        Databases supported for preconfiguration are: mysql, oracle, postgresql, sqlite2, sqlite3, frontbase, ibm_db

    So please help me in this regard.

  • Xen command xl doesn't create a vm but xend/xm does

    - by ineff
    I'm a newbie to Xen, and I've recently installed Xen 4.2 from source on my system. I've found a strange thing: I have a VM that starts fine via the command "xm create machine.cfg", but if I use "xl create machine.cfg" it gives me the following error:

        xc: error: panic: xc_dom_core.c:442: xc_dom_alloc_segment: segment ramdisk too large (0x4ba 0x2000 - 0x1bd9 pages): Out of memory
        libxl: error: libxl_dom.c:208:libxl__build_pv xc_dom_build_image failed: Invalid argument
        cannot (re-)build domain: -3
        xenconsole: Could not read tty from store: No such file or directory

    What could be the problem? Any idea?

  • DISTINCT clause in SQLite

    - by Eye of Hell
    Hello. Recently I found that SQLite doesn't support the DISTINCT ON() clause, which seems to be PostgreSQL-specific. For example, say I have a table t with columns a and b, and I want to select all items with distinct b. Is the following query the only correct way to do so in SQLite?

        select * from t where b in (select distinct b from t)
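
    A minimal sketch of one common SQLite-friendly alternative to DISTINCT ON, using GROUP BY to keep a single row per distinct b; the table t with columns a and b follows the question, and the sample data is illustrative:

        import sqlite3

        con = sqlite3.connect(":memory:")
        cur = con.cursor()
        cur.execute("CREATE TABLE t (a INTEGER, b INTEGER)")
        cur.executemany("INSERT INTO t VALUES (?, ?)",
                        [(1, 10), (2, 10), (3, 20), (4, 20)])

        # One row per distinct b; MIN(a) picks a representative value of a.
        cur.execute("SELECT MIN(a) AS a, b FROM t GROUP BY b")
        print(cur.fetchall())   # [(1, 10), (3, 20)]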

  • Self Ad serving for Linux server

    - by protecttheweb
    I am looking for ad serving software to serve my own ads. I am a newbie and just want your recommendation. Yes, I know about Google DFP, but I want something non-Google, something open source, or something for Linux servers. I want it to be somewhat automatic: advertisers add the banner images or text ads and pay, and the ads are served automatically, or can be kept as drafts until set live. What recommendation do you have?

  • Connect to vmware virtual machine via network

    - by Arnis L.
    I want to connect to my VMware VM from home. I connect to the work network through VPN. The VM sits on one of the workstations (I can connect to it through RDC). Any tips on how to do that? What software do I need (I've got VMware Workstation at the moment)? P.S. I'm quite a newbie at this.

  • Where does Chrome store its bookmarks in Ubuntu 11.10?

    - by Alan Wood
    I looked at all the other posts on this but can't find the directory mentioned (~/.config/google-chrome/Default/Bookmarks, a JSON file). Being a two-day newbie to Ubuntu/Linux, I would like to know whether the location has changed in the latest version or, if not, how I locate the directory indicated. I have logged in as root and searched for the folder and can't find it, although I imported my bookmarks from an HTML file, so I know they must be saved somewhere.
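
    A short sketch that checks the usual Chrome and Chromium locations for the Bookmarks file and lists the bookmark-bar entries; the candidate paths are the standard defaults rather than anything confirmed for this particular install:

        import json
        from pathlib import Path

        # Standard locations for Google Chrome and Chromium on Linux.
        candidates = [
            Path.home() / ".config/google-chrome/Default/Bookmarks",
            Path.home() / ".config/chromium/Default/Bookmarks",
        ]

        for path in candidates:
            if path.exists():
                data = json.loads(path.read_text(encoding="utf-8"))
                bar = data["roots"]["bookmark_bar"].get("children", [])
                print(f"{path}: {len(bar)} entries on the bookmark bar")
                for item in bar:
                    print(" -", item.get("name"))
                break
        else:
            print("No Bookmarks file found in the usual locations.")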

  • Can AutoCAD entities be serialized?

    - by billmuell
    Using ObjectARX (C++) for AutoCAD 2010, can AutoCAD entities be serialized? We need to save the serialized entity in a field in a database (Oracle, PostgreSQL, etc., not AcDbDatabase). It's OK if you show me how to save them to disk, something like this:

        AcDbEntity * entity;
        // ...
        std::ofstream ofs("c:\\filename.fil", std::ios::binary);
        ofs.write((char *)(entity), sizeof(entity));
        ofs.close();

    Thanks

  • How to automatically reseed after using identity_insert?

    - by Earlz
    Hello, I recently migrated from a PostgreSQL database to a SQL Server database. To switch the data over I had to enable IDENTITY_INSERT. Well, come to find out, I get all sorts of strange errors due to duplicate identity values (which are set as primary keys) upon doing an insert into any of the tables. I have quite a few tables. What would be the easiest way of automatically reseeding the identity of every table so that it is after max(RID)?
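
    Not from the question, but a minimal sketch of one way to reseed every table, assuming a pyodbc connection (the connection string is illustrative). It relies on DBCC CHECKIDENT with RESEED, which bumps the current identity value up to the maximum value already stored in the identity column:

        import pyodbc

        # Illustrative connection string -- adjust server, database and credentials.
        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=localhost;DATABASE=mydb;Trusted_Connection=yes"
        )
        cur = conn.cursor()

        # Every user table that actually has an identity column.
        cur.execute("""
            SELECT s.name, t.name
            FROM sys.tables t
            JOIN sys.schemas s ON s.schema_id = t.schema_id
            WHERE OBJECTPROPERTY(t.object_id, 'TableHasIdentity') = 1
        """)
        tables = cur.fetchall()

        for schema, table in tables:
            # RESEED without an explicit value corrects the identity so it is
            # at least the maximum value currently stored in the column.
            cur.execute(f"DBCC CHECKIDENT ('{schema}.{table}', RESEED)")

        conn.commit()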

  • What does "Flush the Firewall" mean?

    - by Qasim
    I know this is a real newbie question but what does it mean when someone says they "flushed the firewall". I got locked out of my server a few times due to the enhanced security configuration I had done and when I contacted my server management company, they said both times that they flushed the firewall and I was allowed back in. I hope "flushing the firewall" doesn't mean they reduced the security settings at all.

  • Is there any certification in simple SQL

    - by Mirage
    I want to do an SQL certification, but the ones I find are vendor-specific: MySQL, PostgreSQL, MS SQL. Is there any plain SQL certification? If not, which one would be good to do at a company level and covers all the topics? I am thinking of going into data warehousing, if that helps.

  • Grails problem with nullable constraint in domain class

    - by xain
    Hi, I'm having the following problem with Grails 1.2.1 domain classes: when I set a constraint attr(nullable:true) and attr is an int or a boolean, the condition isn't reflected in the DB (PostgreSQL 8.4). However, if attr is a String, the DB is consistent with the constraint. Any hints? Thanks

  • Cannot install Nvidia driver X server

    - by Negoti Leboti
    I downloaded NVIDIA-Linux-x86-295.59.run from the official Nvidia website and ran this in the terminal:

        sudo sh NVIDIA-Linux-x86-295.59.run

    The installation started and everything, but I got this error:

        ERROR: You appear to be running an X server; please exit X before installing. For further details, please see the section INSTALLING THE NVIDIA DRIVER in the README available on the Linux driver download page at www.nvidia.com.

    I'm a newbie to Ubuntu and I don't know many commands; can you please tell me step by step?

  • WPF Application with Database.

    - by mike
    Hi, I would like to (or need to) use a database for my WPF project. It has to store "person", "team", "goals" and maybe 2 more things, nothing very big. I've already worked with databases in Java / PHP (PostgreSQL), but is there maybe an "easier" way to store these things? I mean, if the DB were going to be big I could use PostgreSQL or MySQL, but this one will be small.

  • apache+mod_wsgi configuration for django project(s) on a quad core

    - by Stefano
    I've been experimenting for quite some time with a "typical" Django setup on nginx + apache2 + mod_wsgi + memcached (+ PostgreSQL), reading the docs and some questions on SO and SF (see comments). Since I'm still unsatisfied with the behavior (definitely because of some misconfiguration on my part), I would like to know what a good configuration would look like under these assumptions:

        Quad-Core Xeon 2.8GHz
        8 gigs memory
        several django projects (anything special related to this?)

    These are excerpts from my current confs.

    apache2:

        SetEnv VHOST null
        #WSGIPythonOptimize 2
        <VirtualHost *:8082>
            ServerName subdomain.domain.com
            ServerAlias www.domain.com
            SetEnv VHOST subdomain.domain
            AddDefaultCharset UTF-8
            ServerSignature Off
            LogFormat "%{X-Real-IP}i %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" custom
            ErrorLog /home/project1/var/logs/apache_error.log
            CustomLog /home/project1/var/logs/apache_access.log custom
            AllowEncodedSlashes On
            WSGIDaemonProcess subdomain.domain user=www-data group=www-data threads=25
            WSGIScriptAlias / /home/project1/project/wsgi.py
            WSGIProcessGroup %{ENV:VHOST}
        </VirtualHost>

    wsgi.py:

        import os
        import sys

        # setting all the right paths....
        _realpath = os.path.realpath(os.path.dirname(__file__))
        _public_html = os.path.normpath(os.path.join(_realpath, '../'))
        sys.path.append(_realpath)
        sys.path.append(os.path.normpath(os.path.join(_realpath, 'apps')))
        sys.path.append(os.path.normpath(_public_html))
        sys.path.append(os.path.normpath(os.path.join(_public_html, 'libs')))
        sys.path.append(os.path.normpath(os.path.join(_public_html, 'django')))
        os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

        import django.core.handlers.wsgi
        _application = django.core.handlers.wsgi.WSGIHandler()

        def application(environ, start_response):
            """
            Launches django, passing over some environment (domain name) settings.
            The wsgi application group is required; it's also used to generate the
            HOST.DOMAIN.TLD:PORT parameters to pass over.
            """
            application_group = environ['mod_wsgi.application_group']
            assert application_group

            fields = application_group.replace('|', '').split(':')
            server_name = fields[0]
            os.environ['WSGI_APPLICATION_GROUP'] = application_group
            os.environ['WSGI_SERVER_NAME'] = server_name
            if len(fields) > 1:
                os.environ['WSGI_PORT'] = fields[1]

            splitted = server_name.rsplit('.', 2)
            assert len(splitted) >= 2
            splitted.reverse()
            if len(splitted) > 0:
                os.environ['WSGI_TLD'] = splitted[0]
            if len(splitted) > 1:
                os.environ['WSGI_DOMAIN'] = splitted[1]
            if len(splitted) > 2:
                os.environ['WSGI_HOST'] = splitted[2]

            return _application(environ, start_response)

    Folder structure, in case it matters (slightly shortened actually):

        /home/www-data/projectN/var/logs
        /project          (contains manage.py, wsgi.py, settings.py)
        /project/apps     (all the project apps are here)
        /django
        /libs

    Please forgive me in advance if I overlooked something obvious. My main question is about the apache2 mod_wsgi settings. Are those fine? Is 25 threads an /ok/ number with a quad core for only one django project? Is it still ok with several django projects on different virtual hosts? Should I specify 'processes'? Any other directive I should add? Is there anything really bad in the wsgi.py file? I've been reading about potential issues with the standard wsgi.py file; should I switch to that? Or should this conf just be running fine, and I should look for issues somewhere else?

    So, what do I mean by "unsatisfied": well, I often get quite high CPU WAIT; but what is worse is that relatively often apache2 gets stuck. It just does not answer anymore and has to be restarted. I have set up monit to take care of that, but it isn't a real solution. I have been wondering if it's an issue with database access (PostgreSQL) under heavy load, but even if it were, why would the apache2 processes get stuck? Besides these two issues, performance is overall great. I even tried New Relic and got very good average results.
