Search Results

Search found 60391 results on 2416 pages for 'data generation'.

Page 709/2416 | < Previous Page | 705 706 707 708 709 710 711 712 713 714 715 716  | Next Page >

  • How can I optimize the import of this dataset in mysql?

    - by GeoffreyF67
    I've got the following table schema:

        CREATE TABLE `alexa` (
          `id` int(10) unsigned NOT NULL,
          `rank` int(10) unsigned NOT NULL,
          `domain` varchar(63) NOT NULL,
          `domainStatus` varchar(6) DEFAULT NULL,
          PRIMARY KEY (`rank`),
          KEY `domain` (`domain`),
          KEY `id` (`id`)
        ) ENGINE=MyISAM DEFAULT CHARSET=latin1

    It takes several minutes to import the data. To me that seems rather slow, as we're only talking about a million rows of data. What can I do to optimize the insert of this data? (already using disable keys) G-Man

    Read the article

  • How to cache YUI DataSource?

    - by Matt McCormick
    I'm setting up a YUI DataTable with filtering by following the steps on the YUI site. However, I am using JSON as the DataSource ResponseType. When I type in a value to filter, the request is sent to the server again. I find this wasteful, as all the data has already been retrieved the first time. Is there a way to cache the initial data returned and then filter only against that data, so that another AJAX request does not have to be made?

    Read the article

  • mysql innodb max size of transaction

    - by chris
    Using MySQL 5.1.41 with InnoDB, I'm doing some data import, but I can't use LOAD DATA INFILE, so I'm manually issuing INSERT statements. I found that it's much faster to disable autocommit and issue, say, 100 INSERT statements and then commit, instead of relying on the implicit commit after each insert. It got me thinking: what limits are there on how much data I can put into a transaction? Is there a limit on the number of statements, or does it have to do with the size in bytes, etc.?

    Read the article

  • create a text table with php using fixed width font

    - by Hintswen
    I want to write a PHP script to output some data as a plain-text table, using spaces to get the data into columns (just like the Linux top command). I can't use an HTML table, as the script output will be saved to disk and viewed in a plain-text editor. Is there anything available that can do this automatically (format the data into columns)?
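
    One common approach (a rough sketch, not from the original question - the column names, widths and data below are made up) is to pad each field to a fixed width with str_pad():

        <?php
        // Hypothetical data; in practice this would come from the script's real source.
        $rows = array(
            array('PID' => 101, 'USER' => 'root',   'COMMAND' => 'init'),
            array('PID' => 734, 'USER' => 'nobody', 'COMMAND' => 'httpd'),
        );

        // Fixed column widths, chosen by hand here; they could also be computed
        // from the longest value in each column.
        $widths = array('PID' => 6, 'USER' => 10, 'COMMAND' => 20);

        // Header line
        foreach ($widths as $col => $w) {
            echo str_pad($col, $w);
        }
        echo "\n";

        // Data lines
        foreach ($rows as $row) {
            foreach ($widths as $col => $w) {
                echo str_pad($row[$col], $w);
            }
            echo "\n";
        }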

    Read the article

  • Interfacing with Compris POS

    - by Sargun Dhillon
    Does anyone have any data on how to interface with a Compris POS? I have a Compris POS, and I need to grab data from the database. I can't get information from NCR regarding the underlying data format, and I was wondering if anyone had reverse engineered the device, or had any documentation on the device.

    Read the article

  • Nice, clean, simple way of getting a dataset from ASP.NET to plain HTML jQuery or JavaScript library

    - by David S
    I know this is probably an open-ended question, and I have tried looking around so much over the last year or two... maybe I am looking for a perfect place that doesn't exist! Of course it's all about perception, no less. Anyway, just to clarify what I am trying to do and why: I want to be able to use (primarily for the moment) ASP.NET or services thereof to get a dataset - whatever the source data, I can obviously get a dataset of rows/columns. I want to be able to, as simply as possible, get that data over to the client via xml/json/whatever, to then use in a "variety" of ways. "Variety" of ways meaning I would like to "easily" bind that data to, say, a grid, or a combo dropdown, or just simply render it to a textbox - BUT by referencing the dataset as I would on the server side. Now I know this all sounds simplistic, and I know there are lots of complications... so I have tried the following so far over the last year or so:
    - ExtJS - very good, nice solid framework, but just found it a bit too much to use in everyday basic apps - great if I was building a whole application with it
    - Yahoo YUI - not looked recently, but I guess some of the concepts with ExtJS were similar?
    - jQuery - of course to get data etc. it was ok, and I guess there are so many 3rd party plugins that a mix and match might work?
    - Adobe SPRY - ironically this was as close to getting a dataset-style structure to JavaScript/client as I found, although it seemed to drop off/go quiet..? I may be wrong.
    - I did have a very cursory play with Tibco GI and another one I cannot remember the name of! But again, it felt like it was great for building a whole app perhaps?
    Anyway, I am very amazed by all of the technologies coming out, and really not biased one way or the other. I really just want a very simple way of getting data from the server, and a basic/very flexible way of working with that data in the client without using server technologies. I need to keep the server flexible as I may need to use PHP or Java technologies, not just .NET. So again, sorry for the rambles, but if anyone out there has had a simple experience, or would like to share some ideas, it would be very welcomed!! David.

    Read the article

  • populate textboxes with xml node attributes

    - by Doug
    I have Data.xml:

        <?xml version="1.0" encoding="utf-8" ?>
        <data>
          <album>
            <slide title="Autum Leaves" description="Leaves from the fall of 1986"
                   source="images/Autumn Leaves.jpg" thumbnail="images/Autumn Leaves_thumb.jpg" />
            <slide title="Creek" description="Creek in Alaska"
                   source="images/Creek.jpg" thumbnail="images/Creek_thumb.jpg" />
          </album>
        </data>

    I'd like to be able to edit the attributes of each slide node via a GridView (that has a "Select" column added). So far I have:

        protected void GridView1_SelectedIndexChanged(object sender, EventArgs e)
        {
            int selectedIndex = GridView1.SelectedIndex;
            LoadXmlData(selectedIndex);
        }

        private void LoadXmlData(int selectedIndex)
        {
            XmlDocument xmldoc = new XmlDocument();
            xmldoc.Load(MapPath(@"..\photo_gallery\Data.xml"));
            XmlNodeList nodelist = xmldoc.DocumentElement.ChildNodes;
            XmlNode xmlnode = nodelist.Item(selectedIndex);
            titleTextBox.Text = xmlnode.Attributes["title"].InnerText;
            descriptionTextBox.Text = xmlnode.Attributes["description"].InnerText;
            sourceTextBox.Text = xmlnode.Attributes["source"].InnerText;
            thumbTextBox.Text = xmlnode.Attributes["thumbnail"].InnerText;
        }

    The code for LoadXmlData is just a guess on my part - I'm new to working with XML in this way. I'd like to have the user select a row from the GridView, then populate a set of text boxes with each slide attribute for updating back to the Data.xml file. The error I'm getting is "Object reference not set to an instance of an object" at the line: titleTextBox.Text = xmlnode.Attributes["@title"].InnerText; - so I'm not reaching the "title" attribute of the slide node. Thanks for any ideas you may have.

    Read the article

  • Testing ASP.NET security in Firefox

    - by blahblah
    I'm not sure whether this question belongs on StackOverflow or SuperUser, but here goes nothing... I'm trying to test some basic security problems on my personal ASP.NET website to see exactly how the custom validators, etc. behave when tampering with the data. I've been looking at the Firefox extension TamperData, which seems to do the trick, but it doesn't feel very professional at all. The issue I'm having with TamperData is that the textbox for the POST data is way too small to hold the ASP.NET view state, so I have to copy that data into Emacs and back again to be productive at all. I also don't like that there doesn't seem to be an option to only tamper with data that is going to/from localhost. Any ideas on better extensions for the task, or better methods to test this?

    Read the article

  • getjson and servers

    - by gheil.apl
    Is getJSON only useful for those who control a server? Is there any other alternative for getting data from a file? I tried

        $.getJSON("good.json", function(data){
            out = out + "good.json: " + data + ",";
        });

    where good.json = {"a":"1","b":"2"}, and got a result of 'null' for data. These are all valid JSON files, and all give null when used in the above: good.htm assoc.json assoc.js stub.json stub.js test.js test.txt. The above is in an interactive setting at http://jsbin.com/dbJSON/8/edit - the output (of null) can be seen by clicking 'output'.

    Read the article

  • Extracting Certain XML Elements with PHP SimpleXML

    - by Peter
    I am having some problems parsing this piece of XML using SimpleXML. There is always only one Series element, and a variable number of Episode elements beneath it. I want to parse the XML so I can store the Series data in one table, and all the Episode data in another table.

    XML:

        <Data>
          <Series>
            <id>80348</id>
            <Genre>|Action and Adventure|Comedy|Drama|</Genre>
            <IMDB_ID>tt0934814</IMDB_ID>
            <SeriesID>68724</SeriesID>
            <SeriesName>Chuck</SeriesName>
            <banner>graphical/80348-g.jpg</banner>
          </Series>
          <Episode>
            <id>935481</id>
            <Director>Robert Duncan McNeill</Director>
            <EpisodeName>Chuck Versus the Third Dimension 2D</EpisodeName>
            <EpisodeNumber>1</EpisodeNumber>
            <seasonid>27984</seasonid>
            <seriesid>80348</seriesid>
          </Episode>
          <Episode>
            <id>935483</id>
            <Director>Robert Duncan McNeill</Director>
            <EpisodeName>Buy More #15: Employee Health</EpisodeName>
            <EpisodeNumber>2</EpisodeNumber>
            <seasonid>27984</seasonid>
            <seriesid>80348</seriesid>
          </Episode>
        </Data>

    When I attempt to access just the first Series element and its child nodes, or iterate through the Episode elements only, it does not work. I have also tried to use DOMDocument with SimpleXML, but could not get that to work at all.

    PHP code:

        <?php
        if (file_exists('en.xml')) {
            $data = simplexml_load_file('en.xml');
            foreach ($data as $series) {
                echo 'id: <br />' . $series->id;
                echo 'imdb: <br />' . $series->IMDB_ID;
            }
        }
        ?>

    Output:

        id:80348 imdb:tt0934814 id:935481 imdb: id:1534641 imdb:

    Any help would be greatly appreciated.
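
    One way this kind of document is commonly handled with SimpleXML (a sketch only, using the element names from the XML above, not the poster's actual code) is to address the single Series element by name and loop over the Episode children separately:

        <?php
        // Sketch: access the lone <Series> element and each <Episode> element by name,
        // instead of iterating over all children of <Data> together.
        $data = simplexml_load_file('en.xml');

        // There is only one <Series>, so index 0 is enough.
        $series = $data->Series[0];
        echo 'Series id: ' . $series->id . "\n";
        echo 'IMDB:      ' . $series->IMDB_ID . "\n";

        // The <Episode> elements form their own iterable collection.
        foreach ($data->Episode as $episode) {
            echo 'Episode ' . $episode->EpisodeNumber . ': ' . $episode->EpisodeName . "\n";
        }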

    Read the article

  • how to change file permissions in postfix?

    - by Zdanozdan
    Hi, in the Postfix master.cf I have configured a service to invoke an external PHP script:

        smtp      inet  n  -  -  -  -  smtpd
          -o content_filter=myservice:www-data

        myservice unix  -  n  n  -  1  pipe
          flags=Rq user=me null_sender= argv=/home/me/my_script.php

    So far so good - my_script.php is executed and creates the file my_file.txt in my home directory. However, I can only manage:

        -rw------- 1 me www-data 16 2009-10-11 19:35 my_file.txt

    How do I add 'r' permissions for the www-data group?
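
    Assuming the file is created by my_script.php itself (a sketch with an illustrative path - this isn't from the original post), the script can relax its umask before writing, or chmod() the file afterwards, so the www-data group gets read access:

        <?php
        // Sketch: make sure files created by this script are group-readable.
        // The path below is illustrative.
        umask(0027);                            // new files default to rw-r----- (0640)

        $path = '/home/me/my_file.txt';
        file_put_contents($path, "some data\n");

        chmod($path, 0640);                     // explicit, in case the file already existed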

    Read the article

  • Sanitizing user input that will later be e-mailed - what should I be worried about?

    - by Kevin Burke
    I'm interning for an NGO in India (Seva Mandir, http://sevamandir.org) and trying to fix their broken "subscribe to newsletter" box. Because the staff isn't very sophisticated and our web host isn't great, I decided to send the relevant data to the publications person via mail() instead of storing it in a MySQL database. I know that it's best to treat user input as malicious, and I've searched the SO forums for posts relevant to escaping user data for sending in a mail message. I know the data should be escaped; what should I be worried about, and what's the best way to sanitize the input before emailing it? Form flow:
    1. User enters email on the homepage and clicks Submit.
    2. User enters name, address and more information on a second page (bad usability, I know, but my boss asked me to) and clicks "Submit".
    3. Collect the data via $_POST and email it to the publications editor (and possibly send a confirmation to the subscriber).
    I am going to sanitize the email in step 2 and the other data in step 3. I appreciate your help, Kevin
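
    For reference, the usual concerns when handing form data to mail() are header injection (CR/LF sequences smuggled into header values) and trusting the address itself; a minimal sketch, with made-up field and recipient names, might look like this:

        <?php
        // Sketch: validate the address and strip newlines from anything that could
        // reach a mail header. Field names and the recipient address are illustrative.
        function clean_header_value($value) {
            return trim(str_replace(array("\r", "\n", "%0d", "%0a"), '', $value));
        }

        $email = isset($_POST['email']) ? trim($_POST['email']) : '';
        $name  = isset($_POST['name'])  ? clean_header_value($_POST['name']) : '';

        if (filter_var($email, FILTER_VALIDATE_EMAIL) === false) {
            die('Please enter a valid email address.');
        }

        // The body is less dangerous than the headers, but keep it plain text.
        $body = "New newsletter subscriber\nName: {$name}\nEmail: {$email}\n";

        mail('publications@example.org', 'Newsletter subscription', $body,
             'From: webform@example.org' . "\r\n" . 'Reply-To: ' . $email);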

    Read the article

  • How can I append text to the start and end of a file as header and footer?

    - by Morano88
    I'm filling an XML document manually using C# and I need to have <data> as a header and </data> as a footer for the whole XML file. Is there an easy way to do that? I know it can be done, but I couldn't find a way to do it. Keep in mind that I'm updating the entries, so I need to make sure that they always come between the header and the footer. Thanks

    Example:

        <data>
        Entry
        New Entry 1
        New Entry 2
        </data>

    Read the article

  • Confusion about Python encoding

    - by zhangzhong
    I retrieved data encoded in Big5 from the database, and I want to send the data as the HTML content of an email. The code is like this:

        html += """<tr><td>"""
        html += unicode(rs[0], 'big5')   # rs[0] is data encoded in big5

    When I run the script, this error is raised:

        UnicodeDecodeError: 'ascii' codec can't decode byte......

    However, when I tried the code at the interactive Python command line, no errors were raised. Could you give me a clue?

    Read the article

  • Views performance in MySQL

    - by Gianluca Bargelli
    I am currently writing my first real PHP application and I would like to know how to design and implement MySQL views properly. In my particular case, user data is spread across several tables (as a consequence of database normalization) and I was thinking of using a view to group the data into one large table:

        CREATE VIEW `Users_Merged` (
          name, surname, email, phone, role
        ) AS
        ( SELECT name, surname, email, phone, 'Customer' FROM `Customer` )
        UNION
        ( SELECT name, surname, email, tel, 'Admin' FROM `Administrator` )
        UNION
        ( SELECT name, surname, email, tel, 'Manager' FROM `manager` );

    This way I can use the view's data from the PHP app easily, but I don't really know how much this can affect performance. For example:

        SELECT * FROM `Users_Merged` WHERE role = 'Admin';

    Is this the right way to filter the view's data, or should I filter BEFORE creating the view itself? (I need this to get a list of users and the ability to filter them by role.)
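
    As an aside, reading the view from PHP looks the same as reading any table; a small sketch using PDO (the DSN, credentials and variable names here are placeholders, not from the question):

        <?php
        // Sketch: querying the Users_Merged view from PHP with a prepared statement.
        $pdo = new PDO('mysql:host=localhost;dbname=MyDB;charset=utf8', 'user', 'password');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        $stmt = $pdo->prepare('SELECT name, surname, email, phone, role
                               FROM Users_Merged
                               WHERE role = :role');
        $stmt->execute(array(':role' => 'Admin'));

        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $user) {
            echo $user['name'] . ' ' . $user['surname'] . ' <' . $user['email'] . ">\n";
        }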

    Read the article

  • Error in glmmadmb(.....) The function maximizer failed (couldn't find STD file)

    - by Joe King
    This works fine:

        fit.mc1 <- MCMCglmm(bull~1, random=~school, data=dt1, family="categorical",
                            prior=list(R=list(V=1, fix=1), G=list(G1=list(V=1, nu=0))), slice=T)

    So does this:

        fit.glmer <- glmer(bull~(1|school), data=dt1, family=binomial)

    But now I am trying to work with the package glmmADMB, and this does not work:

        fit.mc12 <- glmmadmb(bull~1+(1|school), data=dt1, family="binomial",
                             mcmc=TRUE, mcmc.opts=mcmcControl(mcmc=50000))

    It generates the error:

        Error in glmmadmb(bull~ 1 + (1 | school), data = dt1, family = "binomial", :
          The function maximizer failed (couldn't find STD file)
        In addition: Warning message:
        running command '<snip>\cmd.exe <snip>\glmmadmb.exe" -maxfn 500 -maxph 5 -noinit -shess -mcmc 5000 -mcsave 5 -mcmult 1' had status 1

    Read the article

  • How do I print out objects in an array in python?

    - by Jonathan
    I'm writing code which performs k-means clustering on a set of data. I'm actually using the code from the O'Reilly book Programming Collective Intelligence. Everything works, but in the book the author uses the command line, and I want to write everything in Notepad++. As a reference, his lines are:

        >>> kclust=clusters.kcluster(data,k=10)
        >>> [rownames[r] for r in k[0]]

    Here is my code:

        from PIL import Image,ImageDraw
        import random   # needed by kcluster() below; missing from the original listing

        def readfile(filename):
            lines=[line for line in file(filename)]

            # First line is the column titles
            colnames=lines[0].strip( ).split('\t')[1:]
            rownames=[]
            data=[]
            for line in lines[1:]:
                p=line.strip( ).split('\t')
                # First column in each row is the rowname
                rownames.append(p[0])
                # The data for this row is the remainder of the row
                data.append([float(x) for x in p[1:]])
            return rownames,colnames,data

        from math import sqrt

        def pearson(v1,v2):
            # Simple sums
            sum1=sum(v1)
            sum2=sum(v2)
            # Sums of the squares
            sum1Sq=sum([pow(v,2) for v in v1])
            sum2Sq=sum([pow(v,2) for v in v2])
            # Sum of the products
            pSum=sum([v1[i]*v2[i] for i in range(len(v1))])
            # Calculate r (Pearson score)
            num=pSum-(sum1*sum2/len(v1))
            den=sqrt((sum1Sq-pow(sum1,2)/len(v1))*(sum2Sq-pow(sum2,2)/len(v1)))
            if den==0: return 0
            return 1.0-num/den

        class bicluster:
            def __init__(self,vec,left=None,right=None,distance=0.0,id=None):
                self.left=left
                self.right=right
                self.vec=vec
                self.id=id
                self.distance=distance

        def hcluster(rows,distance=pearson):
            distances={}
            currentclustid=-1
            # Clusters are initially just the rows
            clust=[bicluster(rows[i],id=i) for i in range(len(rows))]
            while len(clust)>1:
                lowestpair=(0,1)
                closest=distance(clust[0].vec,clust[1].vec)
                # loop through every pair looking for the smallest distance
                for i in range(len(clust)):
                    for j in range(i+1,len(clust)):
                        # distances is the cache of distance calculations
                        if (clust[i].id,clust[j].id) not in distances:
                            distances[(clust[i].id,clust[j].id)]=distance(clust[i].vec,clust[j].vec)
                        #print 'i'
                        #print i
                        #print
                        #print 'j'
                        #print j
                        #print
                        d=distances[(clust[i].id,clust[j].id)]
                        if d<closest:
                            closest=d
                            lowestpair=(i,j)
                # calculate the average of the two clusters
                mergevec=[
                    (clust[lowestpair[0]].vec[i]+clust[lowestpair[1]].vec[i])/2.0
                    for i in range(len(clust[0].vec))]
                # create the new cluster
                newcluster=bicluster(mergevec,left=clust[lowestpair[0]],
                                     right=clust[lowestpair[1]],
                                     distance=closest,id=currentclustid)
                # cluster ids that weren't in the original set are negative
                currentclustid-=1
                del clust[lowestpair[1]]
                del clust[lowestpair[0]]
                clust.append(newcluster)
            return clust[0]

        def kcluster(rows,distance=pearson,k=4):
            # Determine the minimum and maximum values for each point
            ranges=[(min([row[i] for row in rows]),max([row[i] for row in rows]))
                    for i in range(len(rows[0]))]
            # Create k randomly placed centroids
            clusters=[[random.random( )*(ranges[i][1]-ranges[i][0])+ranges[i][0]
                       for i in range(len(rows[0]))] for j in range(k)]
            lastmatches=None
            for t in range(100):
                print 'Iteration %d' % t
                bestmatches=[[] for i in range(k)]
                # Find which centroid is the closest for each row
                for j in range(len(rows)):
                    row=rows[j]
                    bestmatch=0
                    for i in range(k):
                        d=distance(clusters[i],row)
                        if d<distance(clusters[bestmatch],row): bestmatch=i
                    bestmatches[bestmatch].append(j)
                # If the results are the same as last time, this is complete
                if bestmatches==lastmatches: break
                lastmatches=bestmatches
                # Move the centroids to the average of their members
                for i in range(k):
                    avgs=[0.0]*len(rows[0])
                    if len(bestmatches[i])>0:
                        for rowid in bestmatches[i]:
                            for m in range(len(rows[rowid])):
                                avgs[m]+=rows[rowid][m]
                        for j in range(len(avgs)):
                            avgs[j]/=len(bestmatches[i])
                        clusters[i]=avgs
            return bestmatches

    Read the article

  • Returning content directly from memcache - Django / HTTP Server

    - by RadiantHex
    Hi folks, I've built a web application with Django and I'm using Memcached to cache data. A few views cache whole HttpResponse objects, so there might be a better alternative for returning the Memcached data than going through Django. What could be faster alternatives for returning Memcached data on HTTP requests? I'm trying to make the operation as fast and lightweight as possible. Help would be much appreciated! :)

    Read the article

  • Problem in loading cities using JSON

    - by Saravanan I M
    I am using GeoNames for loading cities using JSON. I imported the GeoNames data into my database (MS SQL Server 2008). I display a dropdown for country selection; once the user selects a country, I load all of its cities into an autocomplete textbox. I am facing a delay in the getJSON method. It also seems to execute asynchronously, so my autocomplete gets filled before all the data has arrived. Below is my complete script. I think I have some problem in the loop. Please advise me on what I am doing wrong in my code.

        $(document).ready(function () {
            $("#ShowLoad").hide();
            // Hook onto the MakeID list's onchange event
            $("#Country").change(function () {
                findcities = [];
                $("#ShowLoad").show();
                $("#HomeTown").unautocomplete();
                var url = '<%= Url.Content("~/") %>' + "Location/GetCitiesCountByCountry/" + $("#Country").val();
                $.getJSON(url, null, function (data) {
                    var total = data;
                    if (total > 0) {
                        var pageTotal = Math.ceil(total / 1000);
                        var isFilled = false;
                        for (var i = 0; i < pageTotal; i++) {
                            var skip = i == 1 ? 0 : (i * 1000) - 1000;
                            var url = '<%= Url.Content("~/") %>' + "Location/GetCitiesByCountry/" + $("#Country").val() + "?skip=" + skip;
                            //alert(i);
                            $.getJSON(url, null, function (data) {
                                $.each(data.Cities, function (index, optionData) {
                                    if ($("#Country").val() == "US") {
                                        findcities.push(optionData.asciiname + "," + optionData.admin1_code);
                                    } else {
                                        findcities.push(optionData.asciiname);
                                    }
                                });
                                if (i == pageTotal) {
                                    //alert(findcities);
                                    $("#ShowLoad").hide();
                                    $("#HomeTown").focus().autocomplete(findcities);
                                }
                            });
                        }
                        $("#HomeTown").setOptions({ max: 100000 });
                    }
                });
            }).change();
        });

    Read the article

  • Using multiple named outlets and a wrapper view with no content in Emberjs

    - by user1889776
    I'm trying to use multiple named outlets with Ember.js. Is my approach below correct?

    Markup:

        <script type="text/x-handlebars" data-template-name="application">
          <div id="mainArea">
            {{outlet main_area}}
          </div>
        </script>

        <script type="text/x-handlebars" data-template-name="home">
          <ul id="sections">
            {{outlet sections}}
          </ul>
          <ul id="categories">
            {{outlet categories}}
          </ul>
        </script>

        <script type="text/x-handlebars" data-template-name="sections">
          {{#each section in controller}}
            <li><img {{bindAttr src="section.image"}}></li>
          {{/each}}
        </script>

        <script type="text/x-handlebars" data-template-name="categories">
          {{#each category in controller}}
            <img {{bindAttr src="category.image"}}>
          {{/each}}
        </script>

    JS code: here I set the content of the various controllers to data grabbed from a server and connect outlets with their corresponding views. Since the HomeController has no content, I set its content to an empty object - a hack to get rid of this error message: Uncaught Error: assertion failed: Cannot delegate set('categories' ) to the 'content' property of object proxy : its 'content' is undefined.

        App.Router = Ember.Router.extend({
            enableLogging: false,
            root: Ember.Route.extend({
                index: Ember.Route.extend({
                    route: '/',
                    connectOutlets: function(router){
                        router.get('sectionsController').set('content', App.Section.find());
                        router.get('categoriesController').set('content', App.Category.find());
                        router.get('applicationController').connectOutlet('main_area', 'home');
                        router.get('homeController').connectOutlet('home', {});
                        router.get('homeController').connectOutlet('categories', 'categories');
                        router.get('homeController').connectOutlet('sections', 'sections');
                    }
                })
            })
        });

    Read the article

  • R: Why does read.table stop reading a file?

    - by Mike Dewar
    I have a file, called genes.txt, which I'd like to become a data.frame. It's got a lot of lines, and each line has three tab-delimited fields:

        mike$ wc -l genes.txt
        42476 genes.txt

    I'd like to read this file into a data.frame in R. I use the command read.table, like this:

        genes = read.table(
            genes_file,
            sep="\t",
            na.strings="-",
            fill=TRUE,
            col.names=c("GeneSymbol","synonyms","description")
        )

    This seems to work fine, where genes_file points at genes.txt. However, the number of rows in my data.frame is significantly less than the number of lines in my text file:

        > nrow(genes)
        [1] 27896

    and things I can find in the text file:

        mike$ grep "SELL" genes.txt
        SELL    CD62L|LAM1|LECAM1|LEU8|LNHR|LSEL|LYAM1|PLNHR|TQ1    selectin L

    don't seem to be in the data.frame:

        > grep("SELL",genes$GeneSymbol)
        integer(0)

    It turns out that

        genes = read.delim(
            genes_file,
            header=FALSE,
            na.strings="-",
            fill=TRUE,
            col.names=c("GeneSymbol","synonyms","description"),
        )

    works just fine. Why does read.delim work when read.table does not? If it's of use, you can recreate genes.txt using the following commands, which you should run from a command line:

        curl -O ftp://ftp.ncbi.nlm.nih.gov/gene/DATA/gene_info.gz
        gzip -cd gene_info.gz | awk -F '\t' '$1==9606{print $3 "\t" $5 "\t" $9}' > genes.txt

    Be warned, though, that gene_info.gz is 101MB-ish.

    Read the article

  • Server crash = How does a TCP/IP (and the browser-client) behave after this?

    - by jens
    Hello experts, I would be thankful for an explanation of what happens with HTTP (TCP/IP) transmissions when the server crashes unexpectedly, and how the client browser (Firefox / IE) handles this event. What happens in the following two standard cases?
    1. Client actively sends data: the TCP/IP connection has been established and the client (web browser) is sending a POST request with some data, and in the middle of the process of sending, the server crashes. What does this mean for the client? As far as I know, TCP/IP does not "acknowledge" a sent data package, so the client does not know that the server crashed. How will the client behave (Firefox and Internet Explorer)?
    2. The server is actively sending data: as above, the TCP/IP connection has been established and the server is sending a large website to the client (browser). In the middle of the sending process the server crashes, so no further packets are sent. How does the client browser react to this event (Firefox and Internet Explorer)?
    Thank you very much!! Jens

    Read the article

  • nginx reverse proxy cannot access apache virtual hosts

    - by Sc0rian
    I am setting up nginx as a reverse proxy. The server runs on directadmin and lamp stack. I have nginx running on port 81. I can access all my sites (including virtual ips) on the port 81. However when I forward the traffic from port 80 to 81, the virtual ips have a message saying "Apache is running normally". Server IPs are fine, and I can still access virtual IP's on 81. [root@~]# netstat -an | grep LISTEN | egrep ":80|:81" tcp 0 0 <virtual ip>:81 0.0.0.0:* LISTEN tcp 0 0 <virtual ip>:81 0.0.0.0:* LISTEN tcp 0 0 <serverip>:81 0.0.0.0:* LISTEN tcp 0 0 :::80 :::* LISTEN apache 24090 0.6 1.3 29252 13612 ? S 18:34 0:00 /usr/sbin/httpd -k start -DSSL apache 24092 0.9 2.1 39584 22056 ? S 18:34 0:00 /usr/sbin/httpd -k start -DSSL apache 24096 0.2 1.9 35892 20256 ? S 18:34 0:00 /usr/sbin/httpd -k start -DSSL apache 24120 0.3 1.7 35752 17840 ? S 18:34 0:00 /usr/sbin/httpd -k start -DSSL apache 24495 0.0 1.4 30892 14756 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24496 1.0 2.1 39892 22164 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24516 1.5 3.6 55496 38040 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24519 0.1 1.2 28996 13224 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24521 2.7 4.0 58244 41984 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24522 0.0 1.2 29124 12672 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24524 0.0 1.1 28740 12364 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24535 1.1 1.7 36008 17876 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24536 0.0 1.1 28592 12084 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24537 0.0 1.1 28592 12112 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24539 0.0 0.0 0 0 ? Z 18:35 0:00 [httpd] <defunct> apache 24540 0.0 1.1 28592 11540 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL apache 24541 0.0 1.1 28592 11548 ? S 18:35 0:00 /usr/sbin/httpd -k start -DSSL root 24548 0.0 0.0 4132 752 pts/0 R+ 18:35 0:00 egrep apache|nginx root 28238 0.0 0.0 19576 284 ? Ss May29 0:00 nginx: master process /usr/local/nginx/sbin/nginx -c /usr/local/nginx/conf/nginx.conf apache 28239 0.0 0.0 19888 804 ? S May29 0:00 nginx: worker process apache 28240 0.0 0.0 19888 548 ? S May29 0:00 nginx: worker process apache 28241 0.0 0.0 19736 484 ? S May29 0:00 nginx: cache manager process here is my nginx conf: cat /usr/local/nginx/conf/nginx.conf user apache apache; worker_processes 2; # Set it according to what your CPU have. 
4 Cores = 4 worker_rlimit_nofile 8192; pid /var/run/nginx.pid; events { worker_connections 1024; } http { include mime.types; default_type application/octet-stream; log_format main '$remote_addr - $remote_user [$time_local] ' '"$request" $status $body_bytes_sent "$http_referer" ' '"$http_user_agent" "$http_x_forwarded_for"'; server_tokens off; access_log /var/log/nginx_access.log main; error_log /var/log/nginx_error.log debug; server_names_hash_bucket_size 64; sendfile on; tcp_nopush on; tcp_nodelay off; keepalive_timeout 30; gzip on; gzip_comp_level 9; gzip_proxied any; proxy_buffering on; proxy_cache_path /usr/local/nginx/proxy_temp levels=1:2 keys_zone=one:15m inactive=7d max_size=1000m; proxy_buffer_size 16k; proxy_buffers 100 8k; proxy_connect_timeout 60; proxy_send_timeout 60; proxy_read_timeout 60; server { listen <server ip>:81 default rcvbuf=8192 sndbuf=16384 backlog=32000; # Real IP here server_name <server host name> _; # "_" is for handle all hosts that are not described by server_name charset off; access_log /var/log/nginx_host_general.access.log main; location / { proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_pass http://<server ip>; # Real IP here client_max_body_size 16m; client_body_buffer_size 128k; proxy_buffering on; proxy_connect_timeout 90; proxy_send_timeout 90; proxy_read_timeout 120; proxy_buffer_size 16k; proxy_buffers 32 32k; proxy_busy_buffers_size 64k; proxy_temp_file_write_size 64k; } location /nginx_status { stub_status on; access_log off; allow 127.0.0.1; deny all; } } include /usr/local/nginx/vhosts/*.conf; } here is my vhost conf: # cat /usr/local/nginx/vhosts/1.conf server { listen <virt ip>:81 default rcvbuf=8192 sndbuf=16384 backlog=32000; # Real IP here server_name <virt domain name>.com ; # "_" is for handle all hosts that are not described by server_name charset off; access_log /var/log/nginx_host_general.access.log main; location / { proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_pass http://<virt ip>; # Real IP here client_max_body_size 16m; client_body_buffer_size 128k; proxy_buffering on; proxy_connect_timeout 90; proxy_send_timeout 90; proxy_read_timeout 120; proxy_buffer_size 16k; proxy_buffers 32 32k; proxy_busy_buffers_size 64k; proxy_temp_file_write_size 64k; } } Apache config: <VirtualHost xxxxxx:80 > ServerName www.<domain>.com ServerAlias www.<domain>.com <domain>.com ServerAdmin webmaster@<domain>.com DocumentRoot /home/<domain>/domains/<domain>.com/public_html ScriptAlias /cgi-bin/ /home/<domain>/domains/<domain>.com/public_html/cgi-bin/ UseCanonicalName OFF <IfModule !mod_ruid2.c> SuexecUserGroup <domain> <domain> </IfModule> <IfModule mod_ruid2.c> RMode config RUidGid <domain> <domain> RGroups apache access </IfModule> CustomLog /var/log/httpd/domains/<domain>.com.bytes bytes CustomLog /var/log/httpd/domains/<domain>.com.log combined ErrorLog /var/log/httpd/domains/<domain>.com.error.log <Directory /home/<domain>/domains/<domain>.com/public_html> Options +Includes -Indexes php_admin_flag engine ON php_admin_value sendmail_path '/usr/sbin/sendmail -t -i -f <domain>@<domain>.com' </Directory> <virtual ip address>:80 is a NameVirtualHost default server www.xx.com (/usr/local/directadmin/data/users/xx/httpd.conf:16) port 80 namevhost www.xx.com (/usr/local/directadmin/data/users/xx/httpd.conf:16) port 80 namevhost www.xx.co.uk 
(/usr/local/directadmin/data/users/xx/httpd.conf:107) port 80 namevhost www.xx.co.uk (/usr/local/directadmin/data/users/xx/httpd.conf:151) port 80 namevhost www.xx.co.uk (/usr/local/directadmin/data/users/xx/httpd.conf:195) <virtual ip address>:443 is a NameVirtualHost default server www.xx.com (/usr/local/directadmin/data/users/xx/httpd.conf:61) port 443 namevhost www.xx.com (/usr/local/directadmin/data/users/xx/httpd.conf:61) <server ip>:80 is a NameVirtualHost default server localhost (/etc/httpd/conf/extra/httpd-vhosts.conf:29) port 80 namevhost localhost (/etc/httpd/conf/extra/httpd-vhosts.conf:29) port 80 namevhost www.xx.co.uk (/usr/local/directadmin/data/users/admin/httpd.conf:16)

    Read the article

  • MySQL Column Value Pivot

    - by manyxcxi
    I have a MySQL InnoDB table laid out like so: id (int), run_id (int), element_name (varchar), value (text), line_order, column_order

        `MyDB`.`MyTable` (
          `id` bigint(20) NOT NULL,
          `run_id` int(11) NOT NULL,
          `element_name` varchar(255) NOT NULL,
          `value` text,
          `line_order` int(11) default NULL,
          `column_order` int(11) default NULL

    It is used to store data generated by a Java program that used to output this in CSV format, hence the line_order and column_order. Let's say I have two entries (according to the table description):

        1,1,'ELEMENT 1','A',0,0
        2,1,'ELEMENT 2','B',0,1

    I want to pivot this data in a view for reporting, so that it would look more like the CSV would, where the output would be this:

        ---------------------
        |ELEMENT 1|ELEMENT 2|
        ---------------------
        |    A    |    B    |
        ---------------------

    The data coming in is extremely dynamic; it can be in any order, can be any of over 900 different elements, and the value could be anything. The run ID ties them all together, and the line and column order basically let me know where the user wants that data to come back in the output.

    Read the article
