Search Results

Search found 11146 results on 446 pages for 'dynamic queries'.

Page 48/446 | < Previous Page | 44 45 46 47 48 49 50 51 52 53 54 55  | Next Page >

  • Results from two queries at once in sqlite?

    - by SF.
    I'm currently trying to optimize the sluggish process of retrieving a page of log entries from an SQLite database. I noticed I almost always retrieve the next page of entries along with the count of available entries: SELECT time, level, type, text FROM Logs WHERE level IN (%s) ORDER BY time DESC, id DESC LIMIT LOG_REQ_LINES OFFSET %d * LOG_REQ_LINES; together with the total count of records that can match the current query (for a "page n of m" display): SELECT count(*) FROM Logs WHERE level IN (%s); I wonder if I could combine the two queries and run them both in one sqlite3_exec() by simply concatenating the query strings. What should my callback function look like then? Can I distinguish between the different types of data by argc? What other optimizations would you suggest?
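
    One way to avoid the second round trip, as a rough sketch that keeps the question's own %s / %d placeholders, is to fold the count into the paged query as a scalar subquery, so every returned row carries the total alongside the log columns:

        SELECT time, level, type, text,
               (SELECT count(*) FROM Logs WHERE level IN (%s)) AS total
        FROM Logs
        WHERE level IN (%s)
        ORDER BY time DESC, id DESC
        LIMIT LOG_REQ_LINES OFFSET %d * LOG_REQ_LINES;

    The callback then simply sees one extra column (argc of 5 instead of 4) rather than two differently shaped result sets.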

    Read the article

  • Similar SQL queries returning different results...

    - by Pablo
    Here are the SQL queries: $sql1 = "SELECT count(thread) AS total FROM comments WHERE thread=1 AND parent_id=0 "; $sql2 = "SELECT count(thread) AS total FROM comments, users WHERE thread=1 AND parent_id=0 AND users.user_id=comments.user_id "; $sql3 = "SELECT comments.*, users.username AS username FROM comments, users WHERE thread=1 AND parent_id=0 AND users.user_id=comments.user_id ORDER BY date LIMIT 10, 5 "; My question is: why would $sql1 and $sql2 return two different results? $sql1 returns 61 rows; $sql2 returns 56 rows. The fifth line in $sql2 is just for testing and isn't required; it's simply a variation of $sql1 that gets the total row count for pagination.
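
    One way to see where the missing rows go, as a diagnostic sketch only: rewrite $sql2 with an explicit LEFT JOIN. If this version returns 61 again, the 5-row difference is comments whose user_id has no matching row in users, which the implicit inner join in $sql2 silently drops:

        SELECT count(comments.thread) AS total
        FROM comments
        LEFT JOIN users ON users.user_id = comments.user_id
        WHERE comments.thread = 1 AND comments.parent_id = 0;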

    Read the article

  • Optimal search queries

    - by Macros
    Following on from my last question (http://stackoverflow.com/questions/2788082/sql-server-query-performance) and discovering that my method of allowing optional parameters in a search query is suboptimal, does anyone have guidelines on how to approach this? For example, say I have an application table, a customer table and a contact details table, and I want to create an SP which allows searching on some, none or all of surname, home phone, mobile and app ID. I might use something like the following: select * from application a inner join customer c on a.customerid = c.id left join contact hp on (c.id = hp.customerid and hp.contacttype = 'homephone') left join contact mob on (c.id = mob.customerid and mob.contacttype = 'mobile') where (a.ID = @ID or @ID is null) and (c.Surname = @Surname or @Surname is null) and (HP.phonenumber = @Homephone or @Homephone is null) and (MOB.phonenumber = @Mobile or @Mobile is null) The schema used above isn't real, and I wouldn't be using select * in a real-world scenario; it is the construction of the where clause I am interested in. Is there a better approach, either dynamic SQL or an alternative, which can achieve the same result without the need for many nested conditionals? Some SPs may have 10-15 criteria used in this way.
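
    One commonly suggested variation, sketched below on the same made-up schema and assuming SQL Server (which the linked question implies), keeps the catch-all WHERE pattern but adds OPTION (RECOMPILE) so the optimizer builds a plan for the parameter values actually supplied and the unused criteria stop distorting it:

        SELECT a.ID, c.Surname, hp.phonenumber AS homephone, mob.phonenumber AS mobile
        FROM application a
        INNER JOIN customer c ON a.customerid = c.id
        LEFT JOIN contact hp ON c.id = hp.customerid AND hp.contacttype = 'homephone'
        LEFT JOIN contact mob ON c.id = mob.customerid AND mob.contacttype = 'mobile'
        WHERE (a.ID = @ID OR @ID IS NULL)
          AND (c.Surname = @Surname OR @Surname IS NULL)
          AND (hp.phonenumber = @Homephone OR @Homephone IS NULL)
          AND (mob.phonenumber = @Mobile OR @Mobile IS NULL)
        OPTION (RECOMPILE);

    The other route the question hints at is dynamic SQL: build the WHERE clause from only the parameters that were actually supplied and run it through sp_executesql, which keeps plans lean at the cost of some string building.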

    Read the article

  • Django: optimizing queries

    - by Josh
    I want to list the number of items for each list. How can I find this number in a single query, rather than with a query for each list? Here is a simplified version of my current template code: {% for list in lists %} <li> {{ listname }}: {% with list.num_items as item_count %} {{ item_count }} item{{ item_count|pluralize }} {% endwith %} </li> {% endfor %} lists is passed as: List.objects.filter(user=user) and num_items is a property of the List model: def _get_num_items(self): return self.item_set.filter(archived=False).count() num_items = property(_get_num_items) This runs SELECT COUNT(*) FROM "my_app_item" WHERE... n times, where n is the number of lists. Is it possible to make a single query here?
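
    As a sketch of the shape of the single aggregate query being asked for (table and column names below are guesses based on the models shown; Django's annotate() with Count can generate an equivalent statement on the queryset passed to the template):

        SELECT l.id, l.name, COUNT(i.id) AS item_count
        FROM my_app_list AS l
        LEFT JOIN my_app_item AS i
               ON i.list_id = l.id AND i.archived = 0
        WHERE l.user_id = ?  -- the current user's id
        GROUP BY l.id, l.name;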

    Read the article

  • Ruby on Rails: Best way to save search queries in a database

    - by Adam Templeton
    For a RoR app I'm helping develop, I need to save all search queries in a database so I can analyze them later. My plan right now is to create a Result model and table and just save each search query's text in that table, along with the user's ID, the time, etc. However, the app has about 15,000 users, so I'm afraid the single-table approach won't be very efficient when it comes time to parse that data. (The database is set up with MySQL, if that factors in at all.) Am I just being paranoid? Is there a Ruby gem that handles this sort of thing, or a better approach I could take? Any input would be appreciated.

    Read the article

  • AS or not to AS, queries

    - by zeMinimalist
    I'm fairly new to PHP/MySQL and to queries in general. I was just wondering if there's any benefit to using "AS" in a query other than making it look cleaner? Does it speed up the query at all? I probably could have figured this out with a Google search, but I wanted to ask my first question and see how this works. I WILL select an answer (unlike some people...). With: SELECT news.id AS id, news.name AS name FROM news Without: SELECT news.id, news.name FROM news A more complex example from a many-to-many relationship tutorial I found: SELECT c.name, cf.title FROM celebrities AS c JOIN ( SELECT icf.c_id, icf.f_id, f.title FROM int_cf AS icf JOIN films AS f ON icf.f_id = f.f_id ) AS cf ON c.c_id = cf.c_id ORDER BY c.c_id ASC

    Read the article

  • Optimizing MySQL queries with IN operator

    - by Arkadiusz Kondas
    I have a MySQL database with a fairly large table that holds products. Each product has its own id and a categoryId field that stores the id of the category the product belongs to. Now I have a query that pulls products from given categories, such as: SELECT * FROM products WHERE categoryId IN ( 1, 2, 3, 4, 5, 34, 6, 7, 8, 9, 10, 11, 12 ) Of course a WHERE clause and an ORDER BY sort come into it as well, but they are not the issue here. Let's say there are 250k products and over 100k visits per day. Under these conditions the slow_log table registers a lot of these queries with long execution times. Do you have any ideas how to optimize this? The table engine is MyISAM.
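
    Assuming categoryId is not indexed yet (the post doesn't say), the usual first step is to let MySQL resolve the IN (...) list through an index instead of scanning the whole table; a minimal sketch:

        ALTER TABLE products ADD INDEX idx_category (categoryId);

        EXPLAIN SELECT * FROM products
        WHERE categoryId IN (1, 2, 3, 4, 5, 34, 6, 7, 8, 9, 10, 11, 12);

    If the real query also sorts, a composite index on (categoryId, sort_column) and selecting only the columns you need instead of * usually help more than anything else.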

    Read the article

  • Mongo: Finding from multiple queries

    - by waxical
    New to Mongo here. I'm using the PHP lib and trying to work out how I can query a collection for multiple values at once. I could do this by repeating the query with a different value each time, but I wondered if it can be done in one. I.e. $idsToLookFor = array(2124,4241,5553); $query = $db->thisCollection->find(array('id' => $idsToLookFor)); That's what I'd like to do. However it doesn't work. What I'm trying to do is find a set of results for all the ids at one time. Is that possible, or should I just do a findOne() on each with a foreach/for?

    Read the article

  • Complex SQL queries (DELETE)?

    - by Joe
    Hello all, I'm working with three tables, and for simplicity's sake let's call them tables A, B, and C. Tables A and B both have a column called id, as well as one other column, Aattribute and Battribute respectively. Table C also has an id column, plus two other columns which hold values for A.id and B.id. Now, in my code I have easy access to values for both Aattribute and Battribute, and I want to delete the matching row in C, so effectively I want to do something like this: DELETE FROM C WHERE aid=(SELECT id FROM A WHERE Aattribute='myvalue') AND bid=(SELECT id FROM B WHERE Battribute='myothervalue') But this obviously doesn't work. Is there any way to make a single complex query, or do I have to run three queries, where I first get the value of A.id using a SELECT with 'myvalue', then the same for B.id, and then use those in the final query? [Edit: it's not letting me comment, so in response to the first comment on this: I tried the above query and it did not work; I figured it just wasn't syntactically correct. I'm using MS Access, for what it's worth.]
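
    A sketch of the same delete written with IN instead of =, which also copes with subqueries that return more than one id and should be accepted by Access:

        DELETE FROM C
        WHERE aid IN (SELECT id FROM A WHERE Aattribute = 'myvalue')
          AND bid IN (SELECT id FROM B WHERE Battribute = 'myothervalue');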

    Read the article

  • Can I make my drives visible and change their partition type without losing my data?

    - by user165408
    I have made a lot of mistakes and now I cannot see my hard disk, nor can I start the operating system on my laptop. All my passwords and important files are on my hdd, without any backup. I followed this course of action: Changed my hard disk partitions to dynamic just to get a 5th partition. (1st mistake) Decreased the partitions to 4 again. Backed up the operating system from the 4th to the 3rd partition with Norton Ghost. Booted from a live CD for Windows XP. Formatted the 4th partition and moved all my important data from the 1st and 2nd partitions to the 4th partition. Deleted the 1st and 2nd partitions and created 1 partition from half of the empty space. So I have just 3 partitions and empty space between the 1st and 2nd partitions. Tried to install Windows 8 to the first partition, but it did not allow it because the disk is dynamic. It did not allow installing to the other partitions either. Tried to install Windows XP to the 1st partition, but it said that if I continued I could not use the other drives, so I backed out of installing it. Booted from the Windows XP live CD, then grew the 1st partition by less than 400mb of the empty space. I thought it would therefore be adjacent, but it was shown as 2 partitions. In My Computer I see just 3 drives. Using Norton Ghost I recovered my OS to the 1st partition. (2nd mistake: it was on the 4th partition originally) Booted from a Windows XP live CD. I tried to install bcdedit on the Windows XP live CD but it did not work. Then I tried to install EaseUS Partition Master Home Edition. It installed with errors; when I started it, it showed an error saying there is no hard disk. I looked at my PC and my drives were not there. Booted from the Norton Ghost CD and it did not show me my drives either, although I was able to see them before. I checked the partition numbers shown by the Norton Ghost utility and they still have the same numbers, so I should be able to see my drives, but I cannot see them now. My hard disk is shown as an external dynamic disk now, so I cannot see any drive of my PC in the live Windows XP. There are two options: the first is to import the external disk and the second is to convert the disk to basic. Will they delete my data? I am afraid of booting from CDs like the Windows XP live CD, the Norton Ghost CD, and the operating system CD/DVD, because they may overwrite a few MB of my data with their own. These recovery tools already exist on the Windows XP live CD from The Ultimate Boot CD for Windows. Can any of them help me? CompuAppa SwissKnife V3 DBXtract Disk Investigator Fab's AutoBackup 2.0 FileRecovery Floppy Repair Free Undelete Handy Recovery Recovery Manager Restorastion Restorastion Help File by UBCD4Win UnChk Unstoppable Copier Finally: How can I make my drives visible again without losing my data? How can I convert my dynamic partitions to basic without losing my data?

    Read the article

  • Firefox add-on development: Register global dynamic custom keyboard shortcuts

    - by dezwart
    I have been tasked with developing a Firefox add-on that is capable of registering global keyboard shortcuts (ones that will work throughout all areas of Firefox) that will open up the side-bar and execute an XMLRPC request based on previously recorded input. The idea here is that there will be many potential XMLRPC requests that the user will want to execute via a keyboard shortcut. Currently, the add-on is capable of handling pre-defined static keyboard shortcuts via the Firefox overlay. What I would like to achieve, is to allow the user to register their own dynamic custom keyboard shortcut. There is an add-on that currently has some of this functionality, called Keyconfig. I'm not keen on having to ask users to install a second add-on to define their own shortcuts. It also seems that using the dynamic keyboard shortcut registration method in Keyconfig would require the user to close all Firefox windows before the dynamic shortcut is made available. What I would like to know is: Is an XPCOM component the best way to register dynamic keyboard shortcuts from within a Firefox add-on? Is there a way to register the keyboard shortcut so that it is immediately available to all Firefox windows, without having to close the windows beforehand?

    Read the article

  • How to optimize an SQL query with many thousands of WHERE clauses

    - by bugaboo
    I have a series of queries against a very large database, and I have hundreds of thousands of ORs in the WHERE clauses. What is the best and easiest way to optimize such SQL queries? I found some articles about creating temporary tables and using joins, but I am unsure. I'm new to serious SQL, and have been cutting and pasting results from one query into the next. SELECT doc_id, language, author, title FROM doc_text WHERE language='fr' OR language='es' SELECT doc_id, ref_id FROM doc_ref WHERE doc_id=1234567 OR doc_id=1234570 OR doc_id=1234572 OR doc_id=1234596 OR OR OR ... SELECT ref_id, location_id FROM ref_master WHERE ref_id=098765 OR ref_id=987654 OR ref_id=876543 OR OR OR ... SELECT location_id, location_display_name FROM location SELECT doc_id, index_code FROM doc_index WHERE doc_id=1234567 OR doc_id=1234570 OR doc_id=1234572 OR doc_id=1234596 OR OR OR x100,000 These unoptimized queries can take over 24 hours each. Cheers.
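
    A rough sketch of the temporary-table approach from those articles (the scratch-table name is made up here, and exact temporary-table syntax varies by engine): load the id list once, then join against it instead of repeating OR:

        CREATE TEMPORARY TABLE wanted_doc (doc_id INT PRIMARY KEY);

        -- bulk-insert the hundreds of thousands of ids, e.g.
        INSERT INTO wanted_doc (doc_id) VALUES (1234567), (1234570), (1234572), (1234596);

        SELECT dr.doc_id, dr.ref_id
        FROM doc_ref AS dr
        JOIN wanted_doc AS w ON w.doc_id = dr.doc_id;

    The same pattern chains: the next query can join directly against the output of the previous one instead of pasting its results back in as ORs.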

    Read the article

  • get foreign key objects in a single query - Django

    - by John
    Hi, I have 2 models in my Django code: class ModelA(models.Model): name = models.CharField(max_length=255) description = models.CharField(max_length=255) created_by = models.ForeignKey(User) class ModelB(models.Model): category = models.CharField(max_length=255) modela_link = models.ForeignKey(ModelA, related_name='modelb_link') functions = models.CharField(max_length=255) created_by = models.ForeignKey(User) Say ModelA has 100 records, any of which may or may not have links from ModelB. Now say I want to get a list of every ModelA record along with the data from ModelB. I would do: list_a = ModelA.objects.all() Then to get the data for ModelB I would have to do: for i in list_a: i.additional_data = i.modelb_link.all() However this runs a query for every instance of i, making 101 queries in total. Is there any way of running this all in just 1 query, or at least fewer than 101 queries? I've tried ModelA.objects.select_related().all() but this didn't seem to have any effect. Thanks
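
    For reference, the single joined query being asked for has roughly this shape in SQL (table and column names are guessed from the models; Django's default naming would be similar). Starting the queryset from ModelB with select_related('modela_link') gets the ORM to issue one query of this kind for the linked rows, while newer Django versions can cover every ModelA record in two queries by prefetching the reverse relation:

        SELECT a.id, a.name, a.description, b.id, b.category, b.functions
        FROM myapp_modela AS a
        LEFT JOIN myapp_modelb AS b ON b.modela_link_id = a.id;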

    Read the article

  • PHP Startup: Unable to load dynamic library '/usr/lib64/php/modules/json.so' undefined symbol: ZVAL_DELREF

    - by crmpicco
    I have an issue where I am unable to use JSON, which would appear to be because of the following error. There is another thread on this forum that touches on a similar issue, but it's not quite the same. I am using CentOS 5.6 and have the following pear packages installed: [crmpicco@eq-www-php53 ~]$ pear list PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib64/php/modules/json.so' - /usr/lib64/php/modules/json.so: undefined symbol: ZVAL_DELREF in Unknown on line 0 Installed packages, channel pear.php.net: ========================================= Package Version State Archive_Tar 1.3.7 stable Auth_SASL 1.0.2 stable Console_Getopt 1.3.1 stable Image_Barcode 1.1.2 stable Mail 1.1.14 stable Net_SMTP 1.2.10 stable Net_Socket 1.0.8 stable PEAR 1.9.4 stable Structures_Graph 1.0.4 stable XML_RPC 1.5.4 stable XML_Util 1.2.1 stable json 1.2.1 stable and have the following PHP packages installed: [crmpicco@eq-www-php53 ~]$ yum list installed | grep php php.x86_64 5.3.10-1.w5 installed php-cli.x86_64 5.3.10-1.w5 installed php-common.x86_64 5.3.10-1.w5 installed php-devel.x86_64 5.3.10-1.w5 installed php-gd.x86_64 5.3.10-1.w5 installed php-ldap.x86_64 5.3.10-1.w5 installed php-mcrypt.x86_64 5.3.10-1.w5 installed php-mysql.x86_64 5.3.10-1.w5 installed php-pdo.x86_64 5.3.10-1.w5 installed php-pear.noarch 1:1.9.4-1.w5 installed php-pear-Net-Socket.noarch 1.0.8-1.el5.centos installed php-soap.x86_64 5.3.10-1.w5 installed php-xml.x86_64 5.3.10-1.w5 installed The error: [crmpicco@eq-www-php53 ~]$ php -v PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib64/php/modules/json.so' - /usr/lib64/php/modules/json.so: undefined symbol: ZVAL_DELREF in Unknown on line 0 PHP 5.3.10 (cli) (built: Feb 2 2012 23:23:12) Copyright (c) 1997-2012 The PHP Group Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies My repolist reads as: [crmpicco@eq-www-php53 ~]$ yum repolist Loaded plugins: changelog, fastestmirror Excluding Packages in global exclude list Finished repo id repo name status base CentOS-5 - Base 3,548+43 epel Extra Packages for Enterprise Linux 5 - x86_64 6,815+156 extras CentOS-5 - Extras 245+23 rpmforge Red Hat Enterprise 5 - RPMforge.net - dag 11,016+67 updates CentOS-5 - Updates 233 webtatic Webtatic Repository 5 - x86_64 211+183 repolist: 22,068 I am getting HTTP 500 errors everywhere that I use JSON, so my application is non-functional right now.

    Read the article

  • Using dd-wrt Dynamic DNS client with CloudFlare

    - by Roman
    I'm trying to configure the Dynamic DNS client on my router with dd-wrt (v24-sp2) firmware so that it dynamically updates the IP address in one of my DNS records. Unfortunately I encountered a problem… Here is an example request from their ddclient configuration: https://www.cloudflare.com/api.html?a=DIUP&u=<my_login>&tkn=<my_token>&ip=<my_ip>&hosts=<my_record> It works if I use it in a browser, but in dd-wrt I get this output: Tue Jan 24 00:36:47 2012: INADYN: Started 'INADYN Advanced version 1.96-ADV' - dynamic DNS updater. Tue Jan 24 00:36:47 2012: I:INADYN: IP address for alias '<my_record>' needs update to '<my_ip>' Tue Jan 24 00:36:48 2012: W:INADYN: Error validating DYNDNS svr answer. Check usr,pass,hostname! (HTTP/1.1 303 See Other Server: cloudflare-nginx Date: Mon, 23 Jan 2012 14:36:48 GMT Content-Type: text/plain Connection: close Expires: Sun, 25 Jan 1981 05:00:00 GMT Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0 Pragma: no-cache Location: https://www.cloudflare.com/api.html?a=DIUP&u=<my_login>&tkn=<my_token>&ip=<my_ip>&hosts=<my_record> Vary: Accept-Encoding Set-Cookie: __cfduid=<id>; expires=Mon, 23-Dec-2019 23:50:00 GMT; path=/; domain=.cloudflare.com Set-Cookie: __cfduid=<id>; expires=Mon, 23-Dec-2019 23:50:00 GMT; path=/; domain=.www.cloudflare.com You must include an `a' paramiter, with a value of DIUP|wl|chl|nul|ban|comm_news|devmode|sec_lvl|ipv46|ob|cache_lvl|fpurge_ts|async|pre_purge|minify|stats|direct|zone_check|zone_ips|zone_errors|zone_agg|zone_search|zone_time|zone_grab|app|rec_se The URL from "Location" works perfectly, and the parameter "a" is included. What's the problem?

    Read the article

  • Setup Reverse DNS with Cpanel and WHM?

    - by m3d
    I needed to set up reverse DNS via cPanel. I followed the steps in this tutorial but it didn't work: http://docs.cpanel.net/twiki/bin/view/11_30/WHMDocs/RdnsForBind. I use my own name servers registered with GoDaddy, but I am with a VPS hosting company. I did use a new serial number and followed the tutorial exactly, but it doesn't seem to be working. When I check this via Windows nslookup {ip-address}, I still get my hosting company's name for the reverse lookup.

    Read the article

  • DNS servers not set properly? Site failing to load by hostname, though loads fine by IP

    - by Crashalot
    We run www.tekiki.com. Some users, including us, cannot reach www.tekiki.com because of DNS issues. The site resolves fine on the desktop, but it fails from our iPhones and iPads. This doesn't happen to everyone. We noticed the problem yesterday, then set our DNS servers to Cloudflare's DNS servers, hoping that would fix things. Accessing the site by IP addr loads the site fine. Two questions: 1) Does anyone know what the solution is? 2) Should we use other DNS servers besides Cloudflare?

    Read the article

  • Back from Teched US

    - by gsusx
    It's been a few weeks since I last blogged and, trust me, I am not happy about it :( I have been crazily busy with some of our projects at Tellago which you are going to hear more about in the upcoming weeks :) I was so busy that I didn't even have time to blog about my sessions at Teched US last week. This year I ended up presenting three sessions on three different tracks: BIE403 | Real-Time Business Intelligence with Microsoft SQL Server 2008 R2 Session Type: Breakout Session Real-time business...(read more)

    Read the article

  • Tellago & Tellago Studios at Microsoft TechReady

    - by gsusx
    This week Microsoft is hosting the first edition of their annual TechReady conference. Even though TechReady is an internal conference, Microsoft invited us to present not one but two sessions about some of our recent work. We are particularly proud of the fact that one of those sessions is about our SO-Aware service registry. We see this as recognition of the growing popularity of SO-Aware as the best Agile SOA governance solution on the Microsoft platform. Well, on Tuesday I had the opportunity...(read more)

    Read the article

  • Creating PDF documents dynamically using Umbraco and XSL-FO part 2

    - by Vizioz Limited
    Since my last post I have made a few modifications to the PDF generation, the main one being that the files are now dynamically renamed so that they reflect the name of the case study instead of all being called PDF.PDF, which was not a very helpful filename. I just wanted to get something live last week, so I decided that something was better than nothing :)

    The issue with the filenames comes down to the way the PDFs are generated using an alternative template in Umbraco: all you need to do is add "/pdf" to the end of a case study URL and it will create a PDF version of the case study. The downside is that your browser will merrily download the file and save it as PDF.PDF, because that is the name of the last part of the URL.

    What you need to do is set the content-disposition header to the name you would like the file to use; Darren Ferguson mentioned this on the "Change the name of the PDF" forum post.

    We have used the same technique for downloading dynamically generated Excel files, so I thought it would be useful to create a small macro to set both this header and the caching headers to prevent any caching issues. I think in the past we have experienced every possible issue, including various cases where IE behaves differently to other browsers when you are using SSL, so the code below should work in all situations!

    The template for the PDF alternative template is very simple:

        <%@ Master Language="C#" MasterPageFile="~/umbraco/masterpages/default.master" AutoEventWireup="true" %>
        <asp:Content ID="Content1" ContentPlaceHolderID="ContentPlaceHolderDefault" runat="server">
            <umbraco:Macro Alias="PDFHeaders" runat="server"></umbraco:Macro>
            <umbraco:Macro xsl="FO-CaseStudy.xslt" Alias="PDFXSLFO" runat="server"></umbraco:Macro>
        </asp:Content>

    The following code snippet is the XSLT macro that simply builds the file name and then passes it into the helper function:

        <xsl:template match="/">
            <xsl:variable name="fileName">
                <xsl:text>Vizioz_</xsl:text>
                <xsl:value-of select="$currentPage/@nodeName" />
                <xsl:text>_case_study.pdf</xsl:text>
            </xsl:variable>
            <xsl:value-of select="Vizioz.Helper:AddDocumentDownloadHeaders('application/pdf', $fileName)"/>
        </xsl:template>

    And the following code is the helper function that clears the current response and adds all the appropriate headers:

        public static void AddDocumentDownloadHeaders(string contentType, string fileName)
        {
            HttpResponse response = HttpContext.Current.Response;
            HttpRequest request = HttpContext.Current.Request;

            response.Clear();
            response.ClearHeaders();

            if (request.IsSecureConnection & request.Browser.Browser == "IE")
            {
                // Don't use the caching headers if the browser is IE and it's a secure connection
                // see: http://support.microsoft.com/kb/323308
            }
            else
            {
                // force not using the cache
                response.AppendHeader("Cache-Control", "no-cache");
                response.AppendHeader("Cache-Control", "private");
                response.AppendHeader("Cache-Control", "no-store");
                response.AppendHeader("Cache-Control", "must-revalidate");
                response.AppendHeader("Cache-Control", "max-stale=0");
                response.AppendHeader("Cache-Control", "post-check=0");
                response.AppendHeader("Cache-Control", "pre-check=0");
                response.AppendHeader("Pragma", "no-cache");
                response.Cache.SetCacheability(HttpCacheability.NoCache);
                response.Cache.SetNoStore();
                response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(-1));
            }

            response.AppendHeader("Expires", DateTime.Now.AddMinutes(-1).ToLongDateString());
            response.AppendHeader("Keep-Alive", "timeout=3, max=993");
            response.AddHeader("content-disposition", "attachment; filename=\"" + fileName + "\"");
            response.ContentType = contentType;
        }

    I will write another blog post soon with some more details about XSL-FO and how to create the PDFs dynamically. Please do re-tweet if you find this interesting :)

    Read the article

  • PDF from Umbraco | Creating PDF case studies from data in the Umbraco CMS

    - by Vizioz Limited
    Last week we launched the first version of our website based on Umbraco 4.5.2, and this week we have added a bit of extra functionality to the case studies section which enables you to download the case studies as PDF documents. To do this we used the PDF Creator package by Darren Ferguson; this is actually a wrapper around a product from a company called Ibex, which is where you can download documentation for the markup required. The way Darren has built the implementation is really simple for anyone already familiar with the Umbraco CMS: you simply create a new template and call a Usercontrol macro, which then does the magic in the background and passes an XSLT file to the Ibex engine. What you need to be aware of is that you have to learn a new markup language called XSL-FO; this is actually part of the XSL 1.0 specification and is a language used to express print layouts. As an indication of timescale, going from knowing nothing about XSL-FO to the finished product that you can see on the website now has taken me 2 days of learning and fiddling with the markup to get the final result. If anyone is interested I might post some code snippets to show you how some of it is done. I would also be really interested to have some feedback about the PDF layout and what you like and don't like about it. Cheers, Chris. Posted using BlogPress from my iPad

    Read the article

  • Tellago && Tellago Studios 2010

    - by gsusx
    With 2011 around the corner, we at Tellago and Tellago Studios have been spending a lot of time evaluating our successes and failures (yes, those too ;)) of 2010 and delineating some of our goals and strategies for 2011. When I look at 2010, here are some of the things that quickly jump off the page: growing Tellago by 300%; launching a brand new company: Tellago Studios; expanding our customer base; establishing our business intelligence practice http://tellago.com/what-we-say/events/business-intelligence...(read more)

    Read the article

  • DonXml does WCF in NYC

    - by gsusx
    Tomorrow is WCF day in New York City!!!!! My good friend and Tellago's CTO Don Demsak will be doing a session on WCF Data and RIA Services at the WCF fire-starter event to be hosted at the Microsoft offices in New York City. Don has an encyclopedic knowledge of both technologies and will be sharing lots of best practices learned from applying these technologies in large service-oriented environments. In addition to Don, my crazy Cuban friend Miguel Castro will also be presenting three sessions at the...(read more)

    Read the article

  • Two domains, two servers, one dynamic IP address

    - by giantman
    I have two domains, hi.org and bye.net, one dynamic IP address, and two servers. I want to attach one domain, bye.net, to server1 and hi.org to server2. I'm using Apache wamp 2.0i, and the two servers are behind one router with a dynamic IP address. #httpd.conf file additions <IfModule mod_proxy.c> ProxyRequests Off <Proxy *> Order deny,allow Allow from all </Proxy> </IfModule> #vhost file additions NameVirtualHost *:80 #default <VirtualHost *:80> DocumentRoot "c:/wamp/www/fallback" </VirtualHost> # Server 1 <VirtualHost *:80> DocumentRoot "c:/wamp/www" ServerName h**p://bye.net ServerAlias bye.net </VirtualHost> # Server 2 <VirtualHost *:80> ProxyPreserveHost On ProxyPass / h**p://192.168.1.119/ DocumentRoot "g:/wamp/www" ServerName h**p://hi.org ServerAlias hi.org </VirtualHost> After doing all this I always fall back to server1: I don't get the page for hi.org, I only get the page for bye.net, and I don't even get the default fallback page, which should be served when a person enters the IP address rather than a domain name. I use Windows 7 (server 2) and Windows XP (server 1). UPDATE: I needed to remove the DocumentRoot "g:/wamp/www" line :D It was there by mistake! Things are working fine now. But one thing: the URL gets replaced by the local IP address; is there any way to stop that from happening?

    Read the article

  • C# 4.0 in a Nutshell, Fourth Edition

    - by outcoldman
    I just became the lucky owner of this book, C# IN A NUTSHELL, 4th edition. This is the fourth edition in the book's series. I saw the previous, third edition of this book; we presented it at one of our events at Yaroslavl State University, but that was a Russian translation published in Russia, and that was the bad side of that book: books printed in Russia use really bad paper. I should say that I haven't read this book to the end yet, but I am already surprised. Why? Because I have heard a lot about Richter's CLR via C# (I already have the English version of the 3rd edition, and that book is waiting for my attention), and only a few words about C# IN A NUTSHELL, at least in my circles. I heard about this book just once, on one of the Alt.Net group podcasts, and the words were: Richter is a really good book, and C# IN A NUTSHELL is a good handbook. My opinion is: you should read Richter if you want to develop with .NET. But if you want to develop on .NET with C#, you should read C# IN A NUTSHELL too. Read more...

    Read the article
