Search Results



  • Refresh bounded taskflows across regions using Contextual Events

    - by raghu.yadav
    Use cases:
    1) A data change in an inputText field in the left region is reflected in the right region using a contextual event. Example by Frank Nimphius: "Value change event refresh across regions using Contextual Events".
    2) Selecting a tree node in the left region refreshes the dependent detail form in the right region, using dynamic regions and contextual events. Example by Frank Nimphius: Example6-RangeCtx.unzip
    More related examples:
    http://thepeninsulasedge.com/frank_nimphius/2008/02/07/adf-faces-rc-refreshing-a-table-ui-from-a-contextual-event/
    http://www.oracle.com/technology/products/jdev/tips/fnimphius/generictreeselectionlistener/index.html
    http://www.oracle.com/technology/products/jdev/tips/fnimphius/syncheditformwithtree/index.html
    http://biemond.blogspot.com/2009/01/passing-adf-events-between-task-flow.html
    http://www.oracle.com/technology/products/jdev/tips/fnimphius/opentaskflowintab/index.html
    http://lucbors.blogspot.com/2010/03/adf-11g-contextual-event-framework.html
    http://thepeninsulasedge.com/blog/?cat=2
    http://www.ora600.be/news/adf-contextual-events-11g-r1-ps1

    Read the article

  • Error while detecting start profile for instance with ID _u

    - by Techboy
    I am upgrading a SAP Java instance and am getting the error shown in the log below. There are no backup profile files in the profiles directory, and the string _u does not exist in the default, start or instance profile for instance SCS01. Can you suggest what the issue might be?
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.initialize(AbstractPhaseType.java:758) [Thread[main,5,main]]: Phase PREPARE/INIT/DETERMINE_PROFILES has been started.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.initialize(AbstractPhaseType.java:759) [Thread[main,5,main]]: Phase type is com.sap.sdt.j2ee.phases.PhaseTypeDetermineProfiles.
    Mar 13, 2010 12:09:01 PM [Info]: ...ap.sdt.j2ee.phases.PhaseTypeDetermineProfiles.checkVariables(PhaseTypeDetermineProfiles.java:284) [Thread[main,5,main]]: All 4 required variables exist in variable handler.
    Mar 13, 2010 12:09:01 PM [Info]: ....j2ee.tools.sysinfo.AbstractInfoController.updateProfileVariable(AbstractInfoController.java:302) [Thread[main,5,main]]: Parameter /J2EE/StandardSystem/DefaultProfilePath has been detected. Parameter value is \server1\SAPMNT\PI7\SYS\profile\DEFAULT.PFL.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.j2ee.tools.sysinfo.ProfileDetector.detectInstance(ProfileDetector.java:340) [Thread[main,5,main]]: Instance SCS01 with profile \server1\SAPMNT\PI7\SYS\profile\PI7_SCS01_server1, start profile \server1\SAPMNT\PI7\SYS\profile\START_SCS01_server1, and host server1 has been detected.
    Mar 13, 2010 12:09:01 PM [Error]: com.sap.sdt.ucp.phases.AbstractPhaseType.doExecute(AbstractPhaseType.java:862) [Thread[main,5,main]]: Exception has occurred during the execution of the phase.
    Mar 13, 2010 12:09:01 PM [Error]: com.sap.sdt.j2ee.tools.sysinfo.ProfileDetector.detectInstance(ProfileDetector.java:297) [Thread[main,5,main]]: Error while detecting start profile for instance with ID _u.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.cleanup(AbstractPhaseType.java:905) [Thread[main,5,main]]: Phase PREPARE/INIT/DETERMINE_PROFILES has been completed.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.cleanup(AbstractPhaseType.java:906) [Thread[main,5,main]]: Start time: 2010/03/13 12:09:01.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.cleanup(AbstractPhaseType.java:908) [Thread[main,5,main]]: End time: 2010/03/13 12:09:01.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.cleanup(AbstractPhaseType.java:909) [Thread[main,5,main]]: Duration: 0:00:00.078.
    Mar 13, 2010 12:09:01 PM [Info]: com.sap.sdt.ucp.phases.AbstractPhaseType.cleanup(AbstractPhaseType.java:910) [Thread[main,5,main]]: Phase status is error.

    Read the article

  • Plan Caching and Query Memory Part I – When not to use stored procedure or other plan caching mechanisms like sp_executesql or prepared statement

    - by sqlworkshops
    The most common performance mistake SQL Server developers make: SQL Server estimates the memory requirement for a query at compilation time. This works well for dynamic queries that need memory, but not for queries that cache the plan. With dynamic queries the plan is not reused for different sets of parameter values / predicates, so a different amount of memory can be estimated for each set of parameter values / predicates. Common memory-allocating queries are those that perform Sorts and Hash Match operations such as Hash Join, Hash Aggregation or Hash Union. This article covers Sort with examples; it is recommended to read Plan Caching and Query Memory Part II afterwards, which covers Hash Match operations.
    When the plan is cached by using a stored procedure or another plan caching mechanism like sp_executesql or a prepared statement, SQL Server estimates the memory requirement based on the first set of execution parameters. When the same stored procedure is later called with a different set of parameter values, the same amount of memory is used to execute it. This can lead to underestimation / overestimation of memory on plan reuse. Overestimation of memory might not be a noticeable issue for Sort operations, but underestimation of memory will lead to spills to tempdb, resulting in poor performance.
    This article covers underestimation / overestimation of memory for Sort; Plan Caching and Query Memory Part II covers it for Hash Match operations. It is important to note that underestimation of memory for Sort and Hash Match operations leads to spills to tempdb and hence negatively impacts performance, while overestimation of memory affects the memory needs of other concurrently executing queries. In addition, with Hash Match operations, overestimation of memory can itself lead to poor performance.
    To read additional articles I wrote click here.
    In most cases it is cheaper to pay the compilation cost of dynamic queries than the huge cost of a spill to tempdb, unless the memory requirement for a stored procedure does not change significantly based on its predicates.
    The best way to learn is to practice. To create the below tables and reproduce the behavior, join the mailing list by using this link: www.sqlworkshops.com/ml and I will send you the table creation script. Most of these concepts are also covered in our webcasts: www.sqlworkshops.com/webcasts
    Enough theory; let's look at an example where we initially sort 1 month of data and then use the same stored procedure to sort 6 months of data. Let's create a stored procedure that sorts customers by name within a certain date range.
    --Example provided by www.sqlworkshops.com
    create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as
    begin
          declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime
          select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c
                where c.CreationDate between @CreationDateFrom and @CreationDateTo
                order by c.CustomerName
          option (maxdop 1)
    end
    go
    Let's execute the stored procedure initially with a 1 month date range.
    set statistics time on
    go
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-01-31'
    go
    The stored procedure took 48 ms to complete. It was granted 6656 KB based on an estimate of 43199.9 rows. The estimated number of rows, 43199.9, is close to the actual number of rows, 43200, so the memory estimate is fine. There were no Sort Warnings in SQL Profiler.
    Now let's execute the stored procedure with a 6 month date range.
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-06-30'
    go
    The stored procedure took 679 ms to complete. It was again granted 6656 KB based on an estimate of 43199.9 rows. The estimated number of rows, 43199.9, is far from the actual number of rows, 259200, because the estimate is based on the first set of parameter values supplied to the stored procedure, which covered 1 month in our case. This underestimation makes the sort spill to tempdb, resulting in poor performance. There were Sort Warnings in SQL Profiler.
    To monitor the amount of data written to and read from tempdb, one can execute select num_of_bytes_written, num_of_bytes_read from sys.dm_io_virtual_file_stats(2, NULL) before and after the stored procedure execution; for additional information refer to the webcast: www.sqlworkshops.com/webcasts.
    Let's recompile the stored procedure and then execute it first with the 6 month date range. In a production instance it is not advisable to use sp_recompile; instead one should use DBCC FREEPROCCACHE (plan_handle), because of the locking issues involved with sp_recompile. Refer to our webcasts for further details.
    exec sp_recompile CustomersByCreationDate
    go
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-06-30'
    go
    Now the stored procedure took only 294 ms instead of 679 ms. It was granted 26832 KB of memory, and the estimated number of rows, 259200, matches the actual number of rows, 259200. The better performance is due to the better memory estimate, which avoids the sort spilling to tempdb. There were no Sort Warnings in SQL Profiler.
    Now let's execute the stored procedure with a 1 month date range.
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-01-31'
    go
    The stored procedure took 49 ms to complete, similar to our very first execution. This time it was granted more memory (26832 KB) than necessary (6656 KB), based on the 6 month estimate (259200 rows) instead of the 1 month estimate (43199.9 rows), because the estimate is based on the first set of parameter values supplied to the stored procedure, which in this case covered 6 months. The overestimation did not affect performance here, but it can affect other concurrent queries that need memory, so overestimation is not recommended either. Overestimation can also hurt the performance of Hash Match operations; refer to Plan Caching and Query Memory Part II for further details.
    Let's recompile the stored procedure and then execute it first with a 2 day date range.
    exec sp_recompile CustomersByCreationDate
    go
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-01-02'
    go
    The stored procedure took 1 ms. It was granted 1024 KB based on an estimate of 1440 rows. There were no Sort Warnings in SQL Profiler.
    Now let's execute the stored procedure with a 6 month date range.
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-06-30'
    go
    The stored procedure took 955 ms to complete, much higher than the 679 ms or 294 ms we saw before. It was granted 1024 KB based on an estimate of 1440 rows, but we saw earlier that this stored procedure needs 26832 KB of memory to execute the 6 month date range optimally without spilling to tempdb. This is a clear underestimation of memory and the reason for the very poor performance. There were Sort Warnings in SQL Profiler; unlike before, this was a multiple-pass sort instead of a single-pass sort, which happens when the granted memory is far too low.
    Intermediate summary: this issue can be avoided by not caching the plan for memory-allocating queries. Another possibility is to use the recompile hint, or the optimize for hint to allocate memory for a predefined date range.
    Let's recreate the stored procedure with the recompile hint.
    --Example provided by www.sqlworkshops.com
    drop proc CustomersByCreationDate
    go
    create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as
    begin
          declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime
          select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c
                where c.CreationDate between @CreationDateFrom and @CreationDateTo
                order by c.CustomerName
          option (maxdop 1, recompile)
    end
    go
    Let's execute the stored procedure initially with a 1 month date range and then with a 6 month date range.
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-01-30'
    exec CustomersByCreationDate '2001-01-01', '2001-06-30'
    go
    The executions took 48 ms and 291 ms, in line with the previous optimal execution times. The 1 month execution has a good estimate as before, and the 6 month execution also gets a good estimate and memory grant because the query is recompiled with the current set of parameter values. The compilation time and compilation CPU of 1 ms is not expensive in this case compared to the performance benefit.
    Let's recreate the stored procedure with an optimize for hint for the 6 month date range.
    --Example provided by www.sqlworkshops.com
    drop proc CustomersByCreationDate
    go
    create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as
    begin
          declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime
          select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c
                where c.CreationDate between @CreationDateFrom and @CreationDateTo
                order by c.CustomerName
          option (maxdop 1, optimize for (@CreationDateFrom = '2001-01-01', @CreationDateTo ='2001-06-30'))
    end
    go
    Let's execute the stored procedure initially with a 1 month date range and then with a 6 month date range.
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-01-30'
    exec CustomersByCreationDate '2001-01-01', '2001-06-30'
    go
    The executions took 48 ms and 291 ms, in line with the previous optimal execution times. The 1 month execution now overestimates rows and memory, because we told the optimizer to optimize for 6 months of data. The 6 month execution has a good estimate and memory grant for the same reason.
    Let's execute the stored procedure with a 12 month date range using the currently cached plan, which was optimized for the 6 month date range.
    --Example provided by www.sqlworkshops.com
    exec CustomersByCreationDate '2001-01-01', '2001-12-31'
    go
    The stored procedure took 1138 ms to complete. 259200 rows were estimated based on the optimize for hint value for the 6 month date range, while the actual number of rows is 524160 because of the 12 month date range. The stored procedure was granted enough memory to sort the 6 month date range, not the 12 month date range, so the sort spills to tempdb. There were Sort Warnings in SQL Profiler.
    As we see above, the optimize for hint cannot guarantee enough memory and optimal performance the way the recompile hint can.
    This article covered underestimation / overestimation of memory for Sort; Plan Caching and Query Memory Part II covers it for Hash Match operations. Underestimation of memory for Sort and Hash Match operations leads to spills to tempdb and hence negatively impacts performance, overestimation of memory affects the memory needs of other concurrently executing queries, and with Hash Match operations overestimation can itself lead to poor performance.
    Summary: a cached plan can lead to underestimation or overestimation of memory because the memory is estimated based on the first set of execution parameters. It is recommended not to cache the plan if the amount of memory required to execute the stored procedure varies widely. One can mitigate this by using the recompile hint, at the cost of compilation overhead; however, in most cases it is better to pay for compilation than to spill the sort to tempdb, which can be very expensive compared to the compilation cost. The other possibility is the optimize for hint, but if more data is sorted than the hint assumes, the sort will still spill; on the other side there is also the possibility of overestimation causing unnecessary memory pressure on other concurrently executing queries, and in the case of Hash Match operations this overestimation of memory might itself lead to poor performance. When the values used in the optimize for hint are later archived out of the database, the estimate becomes wrong and performance suffers badly, so exercise caution before using the optimize for hint; the recompile hint is better in that case. I explain these concepts with detailed examples in my webcasts (www.sqlworkshops.com/webcasts); I recommend you watch them. The best way to learn is to practice. To create the above tables and reproduce the behavior, join the mailing list at www.sqlworkshops.com/ml and I will send you the relevant SQL scripts.
    Register for the upcoming 3 Day Level 400 Microsoft SQL Server 2008 and SQL Server 2005 Performance Monitoring & Tuning Hands-on Workshop in London, United Kingdom during March 15-17, 2011: click here to register / Microsoft UK TechNet. These are hands-on workshops with a maximum of 12 participants, not lectures. For consulting engagements click here.
    Disclaimer and copyright information: This article refers to organizations and products that may be the trademarks or registered trademarks of their various owners. Copyright of this article belongs to R Meyyappan / www.sqlworkshops.com. You may freely use the ideas and concepts discussed in this article with acknowledgement (www.sqlworkshops.com), but you may not claim any of it as your own work. This article is for informational purposes only; you use any of the suggestions given here entirely at your own risk.
    R Meyyappan rmeyyappan@sqlworkshops.com LinkedIn: http://at.linkedin.com/in/rmeyyappan
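
    As a side note to the recommendation above to use DBCC FREEPROCCACHE (plan_handle) rather than sp_recompile in production, here is a hedged T-SQL sketch of how the plan_handle for this procedure could be looked up; the LIKE filter and the sample handle are illustrative only:
        --find the cached plan for CustomersByCreationDate
        select cp.plan_handle, st.text
        from sys.dm_exec_cached_plans cp
        cross apply sys.dm_exec_sql_text(cp.plan_handle) st
        where st.text like '%CustomersByCreationDate%';
        --then evict just that plan by passing the handle returned above, e.g.:
        --dbcc freeproccache (0x060006001ECA270EC0215D05000000000000000000000000);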

    Read the article

  • Forbidden Not authorized to access this feed

    - by user302593
    Hi, I am trying to create Google Sites using Java, but I get this error:
    Exception in thread "main" com.google.gdata.util.ServiceForbiddenException: Forbidden Not authorized to access this feed
    at com.google.gdata.client.http.HttpGDataRequest.handleErrorResponse(HttpGDataRequest.java:561)
    at com.google.gdata.client.http.GoogleGDataRequest.handleErrorResponse(GoogleGDataRequest.java:563)
    at com.google.gdata.client.http.HttpGDataRequest.checkResponse(HttpGDataRequest.java:536)
    at com.google.gdata.client.http.HttpGDataRequest.execute(HttpGDataRequest.java:515)
    at com.google.gdata.client.http.GoogleGDataRequest.execute(GoogleGDataRequest.java:535)
    at com.google.gdata.client.Service.getFeed(Service.java:1073)
    at com.google.gdata.client.Service.getFeed(Service.java:936)
    at com.google.gdata.client.GoogleService.getFeed(GoogleService.java:631)
    at com.google.gdata.client.Service.getFeed(Service.java:955)
    at lsites.getSiteFeed(lsites.java:25)
    at lsites.main(lsites.java:40)
    What should I do? Regards, Bhuvana
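
    For context, a frequent cause of this ServiceForbiddenException is requesting a private Sites feed without authenticating the service first. A minimal, hedged Java sketch; the application name and credentials below are placeholders, not taken from the question:
        import com.google.gdata.client.sites.SitesService;

        public class SitesAuthSketch {
            public static void main(String[] args) throws Exception {
                // Authenticate before requesting the feed; unauthenticated requests to a
                // private feed come back as 403 Forbidden ("Not authorized to access this feed").
                SitesService service = new SitesService("example-lsites-v1");
                service.setUserCredentials("you@yourdomain.com", "your-password");
                // ...then call service.getFeed(feedUrl, SiteFeed.class) as in the original lsites code.
            }
        }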

    Read the article

  • HTML Redirect issue with Apache2

    - by Vijit Jain
    I am facing an issue with ProxyPass on my Apache server on Ubuntu. I have configured Apache with virtual hosts. An application runs on the server and uses ports 8001 and 8002. I need www.example.com/demo/origin to display the contents I would see when visiting www.example.com:8000; the contents are a set of HTML pages. This is the section of the virtual host config that has issues:
    ProxyPass /demo/vader http://www.example.com:8001/
    ProxyPassReverse /demo/vader http://www.example:8001/
    ProxyPass /demo/skywalker http://www.example.com:8002/
    ProxyPassReverse /demo/skywalker http://www.example.com:8002/
    Now when I visit example.com/demo/skywalker, I see the first page from port 8002, say the login.html page. The second page should have been www.example.com/demo/skywalker/userAction.html; instead the server shows www.example.com:8000/login.html. In the error logs I see something like:
    [Mon Nov 11 18:01:20 2013] [debug] mod_proxy_http.c(1850): proxy: HTTP: FILE NOT FOUND /htdocs/js/demo.72fbff3c9a97f15a4fff28e19b0de909.min.js
    I do not have any folder htdocs on the system. This is only an issue while viewing .html pages; otherwise, no such issue occurs. When I visit localhost:8001 it shows any and all contents without errors or issues. www.example.com/demo/skywalker, www.example.com/demo/origin and www.example.com/demo/vader each display a separate web page. I have also tried one more combination:
    <Location /demo/origin/>
    ProxyPass http://localhost:8000/
    ProxyPassReverse http://localhost:8000/
    ProxyHTMLURLMap http://localhost:8000/ /
    </Location>
    This fails as well. I would greatly appreciate it if anyone can help me resolve this issue.
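
    For illustration, getting the HTML's own links rewritten (so the browser requests /demo/skywalker/userAction.html rather than the backend's root-relative path) is usually done with mod_proxy_html plus trailing slashes that match on both sides. A hedged sketch, not a verified fix for this exact setup:
        <Location /demo/skywalker/>
            ProxyPass http://localhost:8002/
            ProxyPassReverse http://localhost:8002/
            ProxyHTMLEnable On
            ProxyHTMLURLMap http://localhost:8002/ /demo/skywalker/
            ProxyHTMLURLMap / /demo/skywalker/
        </Location>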

    Read the article

  • Managing Multiple dedicated servers centrally using a Web GUI tools?

    - by Sampath
    Application architecture: I have a single Ruby on Rails application running as multiple instances (i.e. each client has an identical sub-domain) across multiple dedicated servers, using Phusion Passenger + nginx. The sub-domain setup is done with the vhost option of the nginx Passenger module. For example:
    Server 1 serves clients 1-100 with the sub-domains www.client1.product.com up to www.client100.product.com
    Server 2 serves clients 101-200 with the sub-domains www.client101.product.com up to www.client200.product.com
    Server 3 serves clients 201-300 with the sub-domains www.client201.product.com up to www.client300.product.com
    My question: I need to centrally manage all N dedicated servers, and I am looking for a web GUI tool to handle tasks like these (a backup sketch for item 1 follows below):
    1) back up all MySQL databases automatically from all dedicated servers and send them to some FTP backup drive
    2) back up files and folders from all dedicated servers and send them to some FTP backup drive
    3) manage the firewall (CSF, http://configserver.com/cp/csf.html) centrally for all dedicated servers
    4) view server load and bandwidth used graphically for all N dedicated servers
    Note: I would prefer an open source solution.
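
    As a concrete illustration of item 1 above (not a recommendation of any particular tool), the per-server piece is often just a cron'd shell script along these lines; a hedged sketch with placeholder credentials and backup host:
        #!/bin/bash
        # Dump all MySQL databases and push the archive to a central FTP backup drive.
        # The user, password and FTP host below are placeholders.
        STAMP=$(date +%F)
        OUT="/var/backups/mysql-$(hostname)-$STAMP.sql.gz"
        mysqldump --all-databases --single-transaction -u backup -p'secret' | gzip > "$OUT"
        # lftp is available in most distro repositories
        lftp -u ftpuser,ftppass ftp://backup.example.com -e "put $OUT; bye"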

    Read the article

  • How to direct reverse proxy requests using wildcard vhosts

    - by HonoredMule
    I'm interested in running a reverse proxy with 2-3 virtual machines behind it. Each internal server will run multiple virtual hosts, and rather than manually configuring each individual vhost on the proxy (a variety of vhosts come and go too often for this to be practical), I would like to use something which can employ pattern matching in a sequential order to find the appropriate back-end server. For example: Server 1: *.dev.mysite.com Server 2: *.stage.mysite.com Server 3: *.mysite.com, dev.mysite.com, stage.mysite.com, mysite.com Server 4: * In the above configuration, task.dev.mysite.com would go to Server 1, dev.mysite.com would go to Server 3, yoursite.stage.mysite.com to Server 2, www.mysite.com to Server 3, and yoursite.com to Server 4. I've looked into using Squid, Varnish, and nginx so far. I have my opinions regarding their respective desirability and general suitability, but it's not readily apparent if any of them can handle dynamic server selection in this manner and not require per-vhost configuration. Apache on the other hand can do this handily and simply, but otherwise (aside from being well-known and familiar) seems very poorly suited to the partly-performance-serving task. Performance isn't actually a major concern yet, but it seems foolish to use Apache if another system will perform far better and can also handle the desired 'hands-free' configuration. But so is frequently having to adjust the gateway for all production services and risk network-wide outage...and so also is setting oneself up for longer downtime later if Apache becomes a too-small bottleneck. Which of these (or other) reverse proxies can do it/would do it best? And maybe I should post this as a separate question, but if Apache is the only practical option, how safe/reliable/predictable is apache-mpm-event in apache2.2 (Ubuntu 12.04.1) particularly for a dedicated reverse proxy? As I understand it the Event MPM was declared "safe" as of 2.4 but it's unclear whether reaching stability in 2.4 has any implications for the older (2.2) versions available in official/stable package channels of various distros.
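
    For what it's worth, nginx's server_name matching already works this way: an exact name wins over wildcards, and a longer wildcard wins over a shorter one, so per-vhost configuration on the proxy isn't needed. A hedged sketch with hypothetical upstream addresses:
        upstream server1 { server 10.0.0.1; }
        upstream server2 { server 10.0.0.2; }
        upstream server3 { server 10.0.0.3; }
        upstream server4 { server 10.0.0.4; }

        server { listen 80; server_name *.dev.mysite.com;   location / { proxy_pass http://server1; } }
        server { listen 80; server_name *.stage.mysite.com; location / { proxy_pass http://server2; } }
        server { listen 80; server_name mysite.com dev.mysite.com stage.mysite.com *.mysite.com;
                 location / { proxy_pass http://server3; } }
        server { listen 80 default_server; server_name _;   location / { proxy_pass http://server4; } }
    With this, task.dev.mysite.com matches the longest wildcard and goes to Server 1, dev.mysite.com matches an exact name and goes to Server 3, and anything else falls through to the default server (Server 4).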

    Read the article

  • Google shows subdomain of main site instead of add on domain URL

    - by Welsher
    I have my host (Lunarpages) set up with a few add-on domains to my main account. These show up as sub-domains of my main account, but they can also be reached via the new domains I've created. So:
    subdomain1.domain.com -- www.mynewsite.com
    subdomain2.domain.com -- www.myothersite.com
    etc. The problem is that mynewsite.com shows up in Google under that domain, but myothersite.com shows up as subdomain2.domain.com. I don't have a clue what might be causing this. If anyone has any advice or can point me in the right direction, I'd really appreciate it! Thanks.
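
    If the goal is simply to keep the sub-domain alias out of the index, one common approach is a 301 from the alias host to the add-on domain. A hedged .htaccess sketch placed in the add-on domain's folder, using the host names from the question and assuming mod_rewrite is available:
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^subdomain2\.domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.myothersite.com/$1 [R=301,L]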

    Read the article

  • Google indexing page with parameters but page is Disallowed in robots.txt

    - by Jakobud
    I have the following in robots.txt: User-agent: * Disallow: /refer.php User-agent: NinjaBot Allow: / Sitemap: http://www.mysite.com/sitemap.xml The refer.php file does various things depending on what GET parameters are passed to it. When I do a Google search, I see tons of results for pages like this: http://www.mysite.com/refer.php?o=23945 http://www.mysite.com/refer.php?o=39858 http://www.mysite.com/refer.php?o=9683 http://www.mysite.com/refer.php?o=10569 http://www.mysite.com/refer.php?o=58304 http://www.mysite.com/refer.php?o=69604 Is the reason that Google is indexing these because I don't have an asterisk * after refer.php in the robots.txt ? Should changing it to Disallow: /refer.php* fix the problem?
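
    For context, Disallow values in robots.txt are prefix matches, so Disallow: /refer.php already covers /refer.php?o=...; a trailing * adds nothing. Blocked URLs can still be listed (uncrawled) if other pages link to them, so an alternative is to let refer.php be crawled and mark it noindex instead. A hedged sketch of both pieces:
        # robots.txt - prefix match, wildcard not required
        User-agent: *
        Disallow: /refer.php

        # or, instead of blocking it, have refer.php send this response header:
        X-Robots-Tag: noindex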

    Read the article

  • Weird 301 redirection by google crawler

    - by Ace
    I have some pages on my website www.acethem.com which are having 301 redirection but they are not actually 301 redirects. e.g. www.acethem.com/pastpapers/by-year/2007/ is seen as a 301 redirection to www.acethem.com/pastpapers/by-year by google (I am using "Fetch as google" in webmaster tools. Now more weird: My paginated pages with page = 10 are all redirected to homepage: http://www.acethem.com/pastpapers/o-level/chemistry/page/10/ while http://www.acethem.com/pastpapers/o-level/chemistry/page/9/ is working properly in google crawler. Note that all these pages work fine with no redirect in browsers. Sidenote: on www.acethem.com/pastpapers/by-year/2007/, the facebook share button also points to www.acethem.com/pastpapers/by-year/.
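
    One way to see what the crawler is actually served is to request the URL with and without a Googlebot user agent and compare the responses; a hedged diagnostic sketch:
        curl -I http://www.acethem.com/pastpapers/by-year/2007/
        curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://www.acethem.com/pastpapers/by-year/2007/
    A 301 on the second request but not the first would point to user-agent-dependent rewrite or cloaking rules rather than a Google-side quirk.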

    Read the article

  • apache domain redirect to subfolder

    - by Dennis
    I have a hosting account with GoDaddy. It's a Linux system running Apache. The way they do their setup, your primary domain is the root folder, and when you add a sub-domain it becomes a sub-folder of the root, which sucks. I want to set up a sub-folder structure to organize my domains. I called GoDaddy support and they said to use redirects, but they did not know how to do that.
    How it's set up now:
    primary domain: www.domain.com -> /
    sub.domain.com -> /sub
    I want to create a directory structure and then redirect to each, but only show www.domain.com in the URL:
    www.domain.com -> /domain/www
    sub.domain.com -> /domain/sub
    I tried using:
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www.)?domain.com$
    RewriteRule ^(/)?$ domain/www [L]
    but it just changes the URL to www.domain.com/domain/www. Can this be done in .htaccess?
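
    For illustration, an internal rewrite (no R flag) that covers every request, not just the root, keeps www.domain.com in the address bar while serving files from /domain/www. A hedged sketch for the root .htaccess, assuming mod_rewrite is enabled:
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/domain/www/
        RewriteRule ^(.*)$ /domain/www/$1 [L]
    Without the R flag mod_rewrite rewrites internally; the visible URL only changes if the target itself issues a redirect.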

    Read the article

  • Optimizing widget repaints with QPixmapCache, an article by Mark Summerfield translated by

    Hello everyone. Here is a new translation of a Qt Quarterly article describing a method of caching pixmaps to optimize the painting of widgets. The method is illustrated through two custom widget examples whose source code is provided. http://qt-quarterly.developpez.com/qq-12/qpixmapcache/ As a reminder, the Qt Quarterly articles are technical articles written by Qt specialists. Several of them have been translated into French by the developpez.com Qt editorial team; the translations can be found at: http://qt-quarterly.developpez.com/ ...
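
    For readers who do not follow the link, the gist of the technique is to render the widget once into a QPixmap stored under a key and to repaint from the cache afterwards. A minimal hedged C++ sketch (Qt 4 style; the class, key and drawing are invented for illustration and are not from the article):
        #include <QtGui>

        class CachedWidget : public QWidget
        {
        protected:
            void paintEvent(QPaintEvent *)
            {
                QPixmap pm;
                const QString key = QString("cachedwidget-%1x%2").arg(width()).arg(height());
                if (!QPixmapCache::find(key, pm)) {      // cache miss: do the expensive painting once
                    pm = QPixmap(size());
                    pm.fill(Qt::transparent);
                    QPainter p(&pm);
                    p.setRenderHint(QPainter::Antialiasing);
                    p.setBrush(Qt::darkBlue);
                    p.drawEllipse(rect().adjusted(1, 1, -1, -1));
                    QPixmapCache::insert(key, pm);       // store it for later repaints
                }
                QPainter(this).drawPixmap(0, 0, pm);     // cheap redraw from the cached pixmap
            }
        };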

    Read the article

  • Apt-get update through tor

    - by Alexander
    I'm trying to update my apt-get package lists. In my country a lot of sites are blocked or have been blocked by companies. When I use a proxy for the whole system I get errors; Tor works perfectly when browsing. My question is: can I run apt-get through a Tor connection, so that "apt-get update" completes without errors? Thanks in advance.
    Edit: BTW, I'm using Ubuntu 13.10 and Tor 0.2.21.
    alexander@Alexander-PC:~$ sudo apt-get update
    [sudo] password for alexander:
    Ign http://extras.ubuntu.com saucy InRelease
    Ign http://security.ubuntu.com saucy-security InRelease
    Ign http://us.archive.ubuntu.com saucy InRelease
    Hit http://extras.ubuntu.com saucy Release.gpg
    Get:1 http://dl.google.com stable InRelease [1,540 B]
    100% [1 InRelease gpgv 1,540 B] [Waiting for headers] [Waiting for headers] [WaSplitting up /var/lib/apt/lists/partial/dl.google.com_linux_chrome_deb_dists_stab
    Ign http://dl.google.com stable InRelease
    E: GPG error: http://dl.google.com stable InRelease: Clearsigned file isn't valid, got 'NODATA' (does the network require authentication?
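
    Two approaches that are commonly used for this; a hedged sketch in which the package names are assumptions and the local HTTP-to-Tor proxy (e.g. privoxy) is assumed to listen on its default port 8118:
        # option 1: wrap apt-get in torsocks so its HTTP traffic goes through Tor's SOCKS port
        sudo apt-get install torsocks
        sudo torify apt-get update

        # option 2: point apt at a local HTTP proxy that forwards to Tor,
        # e.g. privoxy, via a file such as /etc/apt/apt.conf.d/95proxy:
        Acquire::http::Proxy "http://127.0.0.1:8118/";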

    Read the article

  • ItemSearch totally different from Amazon.com search - What am I doing wrong?

    - by RadiantHex
    I'm using ItemSearch in order to get a list of books ordered by 'salesrank', aka 'bestselling'. The problem is that the books returned for any BrowseNode are totally different from the Amazon list.
    Test case: Author: "J R R Tolkien", SearchIndex: "Books", Sort: "salesrank"
    Using the API:
    J.R.R. Tolkien Boxed Set (The Hobbit and The Lord of the Rings)
    The Hobbit: 70th Anniversary Edition
    The Lord of the Rings: 50th Anniversary, One Vol. Edition
    The Legend of Sigurd and Gudrun
    The Silmarillion
    The Lord of the Rings
    The Children of Hurin
    The Fellowship of the Ring: Being the First Part of The Lord of the Rings
    The Return of the King: Being the Third Part of The Lord of the Rings
    Using Amazon.com Adv. Search:
    The Lord of the Rings (Trilogy)
    The Hobbit
    J.R.R. Tolkien Boxed Set (The Hobbit and The Lord of the Rings)
    The Legend of Sigurd and Gudrun
    The Lord of the Rings: 50th Anniversary, One Vol. Edition
    The Silmarillion
    The Fellowship of the Ring
    The Children of Hurin
    Lord of the Rings, The Return of the King, The (Vol 3)
    Help would be very much appreciated.

    Read the article

  • Canonical url for a home page and trailing slashes

    - by serg
    My home page could potentially be linked as:
    http://example.com
    http://example.com/
    http://example.com/?ref=1
    http://example.com/index.html
    http://example.com/index.html?ref=2
    (the same page is served for all those URLs). I am thinking about defining a canonical URL to make sure Google doesn't consider those URLs to be different pages:
    <link rel="canonical" href="/" /> (relative)
    <link rel="canonical" href="http://example.com/" /> (trailing slash)
    <link rel="canonical" href="http://example.com" /> (no trailing slash)
    Which one should be used? I would just slap / in, but messing with canonical seems like scary business, so I wanted to double check first. Is it a good idea at all to define a canonical URL for a home page?

    Read the article

  • Apache - .httaccess RewriteRule from domainA to domainB

    - by milo5b
    Problem: I have a website (mywebsite.com) that was, and partly still is, indexed in Google. Somebody pointed their own domain name (theirsite.com) at my server and DNS, so it resolves to my IP. Now, probably being an older domain, it outranks me in Google, and the pages at my domain are starting to get de-indexed (probably duplicate content or something). So, for example, my homepage got de-indexed, and their homepage (theirsite.com/) is indexed with my content/code/etc. The same goes for other pages (theirsite.com/other/page.html is showing mysite.com/other/page.html).
    Quick fix: To quickly fix it, I have added a few lines to my PHP code, checking $_SERVER['HTTP_HOST'] and redirecting to my domain if the host is different from my domain. It does the job, but to me it looks like a dirty solution.
    Question: I could not find a way to have Apache do this job. I would prefer an apache/.htaccess solution to this problem (redirecting all traffic from domainA.com/(.*) to domainB.com/$1); is it possible in any way? Thanks
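
    For reference, a host-based 301 of this kind is a standard mod_rewrite pattern. A hedged .htaccess sketch that treats mywebsite.com as the canonical host and redirects any other Host header (including theirsite.com) to it, preserving the path:
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^(www\.)?mywebsite\.com$ [NC]
        RewriteRule ^(.*)$ http://mywebsite.com/$1 [R=301,L]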

    Read the article

  • sg_map & lsscsi showing old storage version

    - by PratapSingh
    I am using SUN storage and recently upgraded/refreshed my ISCSI LUN storage. We have replicated old storage to new storage and attached to our servers. I can see at SUN storage side that storage is attached to server and also from server when I run the below command it prints the following output : iscsiadm -m session tcp: [1] 10.1.1.10:3260,2 iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd The above storage is SUN STORAGE 7420 But when I run sg_map or lsscsi command it prints different version: lsscsi disk SUN Sun Storage 7410 1.0 /dev/sda disk SUN Sun Storage 7410 1.0 /dev/sdb disk SUN Sun Storage 7410 1.0 /dev/sdc disk SUN Sun Storage 7410 1.0 /dev/sdd Output of ls on "/dev/disk/by-path/" ls -1 /dev/disk/by-path/ ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-0 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-0-part1 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-18 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-18-part1 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-2 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-2-part1 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-4 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-4-part1 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-6 ip-10.1.1.10:3260-iscsi-iqn.86-03.com.sun:02:afsfsf58-c56a-6ba8-a944-addd258687cd-lun-6-part1 I have rebooted server twice but still I am getting the same output as given above.
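
    If the array itself now reports the new model string, one thing that may refresh what lsscsi and sg_map show without another reboot is forcing the SCSI layer to re-read the devices' INQUIRY data; a hedged sketch, device and host numbers assumed:
        # re-read data for an existing device (e.g. sda)
        echo 1 > /sys/block/sda/device/rescan
        # or rescan the whole SCSI/iSCSI host (host number assumed)
        echo "- - -" > /sys/class/scsi_host/host3/scan
        # sg3_utils also ships rescan-scsi-bus.sh, which automates this
    If the strings still read "Sun Storage 7410" afterwards, the replicated LUNs themselves are probably still presenting the old inquiry data.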

    Read the article

  • Apache Tomcat Server Error

    - by Sam....
    I am trying to install Tomcat but get this error every time, whether it is the binary or the .exe install:
    SEVERE: Begin event threw exception
    java.lang.ClassNotFoundException: org.apache.catalina.core.AprLifecycleListener
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at org.apache.commons.digester.ObjectCreateRule.begin(ObjectCreateRule.java:204)
    at org.apache.commons.digester.Rule.begin(Rule.java:152)
    at org.apache.commons.digester.Digester.startElement(Digester.java:1286)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.AbstractXMLDocumentParser.emptyElement(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanStartElement(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
    at org.apache.commons.digester.Digester.parse(Digester.java:1572)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:451)
    at org.apache.catalina.startup.Catalina.execute(Catalina.java:402)
    at org.apache.catalina.startup.Catalina.process(Catalina.java:180)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:202)
    Can anyone please help me solve this? I need it urgently.
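
    For reference, the listener named in the exception is declared near the top of conf/server.xml in a stock Tomcat install; a ClassNotFoundException for a core org.apache.catalina class usually points to a damaged installation or a CLASSPATH/JAVA_HOME picking up the wrong jars rather than to the listener itself. The relevant line, which can be commented out as a test, looks like this in a default server.xml (hedged sketch, default paths assumed):
        <Listener className="org.apache.catalina.core.AprLifecycleListener" SSLEngine="on" />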

    Read the article

  • Failed to fetch URL http://dl-ssl.google.com/android/repository/addons_list-1.xml, reason: Failure initializing default SSL context

    - by user1156220
    Launching the Android SDK manager from the command line gets me this error and others like it:
    Failed to fetch URL http://dl-ssl.google.com/android/repository/addons_list-1.xml, reason: Failure initializing default SSL context
    I've done plenty of searching and have forced http instead of https, created an androidtool.cfg file and added the line sdkman.force.http=true. Permissions for all files in ~/.android are rw r r, and I'm calling android logged in as the owner of those files. I am not using a proxy and I have no anti-virus running. I just installed Fedora 16 and am not sure of any firewalls running by default. I suspect a permissions problem somewhere along the line. Any ideas?

    Read the article

  • Apache2 Manage Server default

    - by Jaime E. Valdez
    I'm trying to set up two domains correctly and have some issues I hope you can help with. Site one's conf:
    <VirtualHost myipaddress:80>
    ServerName www.domain1.com
    ServerAdmin hostmaster@domain1.com
    DocumentRoot /home/domain1/public_html
    </VirtualHost>
    My other domain's conf is:
    <VirtualHost myipaddress:80>
    ServerName www.domain2.com
    ServerAlias *.domain2.com domain2.com
    ServerAdmin hostmaster@domain2.com
    DocumentRoot /home/domain2/public_html
    </VirtualHost>
    The default site is disabled. The problem is that when accessing "domain2.com" from my browser, it always redirects to "www.domain1.com"; it only works when I explicitly access "www.domain2.com". I also have other domains like "domain1.net" and "domain1.info" pointing to my server; they are not configured or set up in Apache yet, but I can access them from the browser and always end up on "www.domain1.com". By the way, is there any possible Apache configuration to handle the IP only? I mean, if I type "http://myipaddress/" I get "www.domain1.com"... Arrgh.
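
    For illustration, with name-based virtual hosts on one IP (Apache 2.2 style) the server needs a NameVirtualHost directive matching the VirtualHost lines, otherwise only the first vhost is ever used; the first vhost listed also acts as the default for unmatched hosts and for requests by bare IP. A hedged sketch, not a verified config for this exact server:
        NameVirtualHost *:80

        # first vhost = default for unmatched Host headers and http://myipaddress/
        <VirtualHost *:80>
            ServerName www.domain1.com
            ServerAlias domain1.com *.domain1.com
            DocumentRoot /home/domain1/public_html
        </VirtualHost>

        <VirtualHost *:80>
            ServerName www.domain2.com
            ServerAlias domain2.com *.domain2.com
            DocumentRoot /home/domain2/public_html
        </VirtualHost>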

    Read the article

  • 1 Google Analytics account or top-level domain + profiles for sub-domains vs. 1 account for each sub-domain

    - by Eric Nguyen
    We have the following websites:
    An online magazine, Singapore edition - sg.abc.com
    The same online magazine, Malaysia edition - my.abc.com
    Forums around the same subjects as the online magazine, but functioning independently - forums.abc.com
    A classifieds site, also around the same subjects, functioning independently - directory.abc.com
    Each of the above websites currently has its own Google Analytics account, and abc.com has a separate Google Analytics account too. sg.abc.com has the most traffic and generates the most revenue.
    Are there any practical benefits of merging all the above sub-domains under abc.com? I can think of more reliable analytics and consistency for sure. Are there more? Cross-sales?

    Read the article

  • Visual Studio 2010 Short Cut Links!

    - by Dave Noderer
    This week Scott Cate came to South Florida and gave a great talk on his Visual Studio shortcuts and how he uses them. You can find a collection of short videos he has done at: http://scottcate.com/tricks/ Also, you might want to check out Sara Ford's blog: http://blogs.msdn.com/saraford/default.aspx - she started out doing a tip a day but has many more now. Scott covers many of these in the videos. And, as with past releases, the languages team has provided PDFs with a lot of keyboard shortcuts, this time for VB, C#, F# and C++. You can find downloads for all of these at the top of the FlaDotNet.com page; they are also listed below:
    VB: http://www.fladotnet.com/downloads/VS2010VB.pdf
    C#: http://www.fladotnet.com/downloads/VS2010CSharp.pdf
    F#: http://www.fladotnet.com/downloads/VS2010FSharp.pdf
    C++: http://www.fladotnet.com/downloads/VS2010CPP.pdf
    Happy keyboarding!

    Read the article

  • Google Analytics - Unable to get GA Tracking

    - by Pure.Krome
    We've been using GA for a few years with no probs. About 2-3 weeks ago we tried to clean up some of our tracking and on one of our profiles, it's not working anymore (since oct 10.) First, some context then some GA Debugging code. 1. Context. We have the following setup: different root domains AND different sub-domains on one of the root domains. www.website.com www.website.com.au www.anotherWebsite.com foo.website.com baa.website.com So what we're doing is the following: each root domain and each sub-domain get their own tracking code. This way we can allow separate people (from outside our company) to access only their own data. Eg. a manager for foo.website.com can only see data related to that domain .. and see data on the other domains. Have a last account which is the SUM of all the domains. this is for us. so we can see total numbers. So to do this, we have two trackers that fire off, on the page. the individual accounts all work fine - they seem to be tracking data ok. the 'global' account is not working and this gives us the = Tracking Not Installed error. This has been going on since oct 10. So the wait 24/48/72 hours thing is waaaaay over. 2. GA Debug code. Installing GA Debug chrome extension gives the following output. I've tried to hide anything that could be considered secret. UA-XXXXX34-1 == Global account (which isn't working any more). UA-XXXXX34-11 == Specific account for www.website.com _gaq.push processing "_setAccount" for args: "[UA-XXXXX34-1]": ga_debug.js:18 _gaq.push processing "_setDomainName" for args: "[website.com]": ga_debug.js:18 _gaq.push processing "_setAllowLinker" for args: "[true]": ga_debug.js:18 _gaq.push processing "_trackPageview" for args: "[]": ga_debug.js:18 Track Pageview ga_debug.js:18 Tracking beacon sent! utmwv=--snipped-- Account ID : UA-XXXX234-1 Page Title : Some page title Host Name : www.website.com Page : / Referring URL : - Hit ID : 1923583969 Visitor ID : 785310647 Session Count : 51 Session Time - First : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Session Time - Last : Mon Oct 29 2012 11:41:46 GMT 1100 (AUS Eastern Summer Time) Session Time - Current : Mon Oct 29 2012 12:19:23 GMT 1100 (AUS Eastern Summer Time) Campaign Time : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Campaign Session : 1 Campaign Count : 1 Campaign Source : (direct) Campaign Medium : (none); Campaign Name : (direct) Language : en-gb Encoding : UTF-8 Flash Version : 11.4 r31 Java Enabled : true Screen Resolution : 1050x1680 Browser Size : 1033x861 Color Depth : 32-bit Ga.js Version : 5.3.7d Cachebuster : 1846514973 ga_debug.js:18 _gaq.push processing "_setAccount" for args: "[UA-XXXX234-11]": ga_debug.js:18 _gaq.push processing "_setDomainName" for args: "[website.com]": ga_debug.js:18 _gaq.push processing "_setAllowLinker" for args: "[true]": ga_debug.js:18 _gaq.push processing "_trackPageview" for args: "[]": ga_debug.js:18 Track Pageview ga_debug.js:18 Tracking beacon sent! 
utmwv=--snipped-- Account ID : UA-XXXX234-11 Page Title : SomePageTitle Host Name : www.website.com Page : / Referring URL : - Hit ID : 1923583969 Visitor ID : 785310647 Session Count : 51 Session Time - First : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Session Time - Last : Mon Oct 29 2012 11:41:46 GMT 1100 (AUS Eastern Summer Time) Session Time - Current : Mon Oct 29 2012 12:19:23 GMT 1100 (AUS Eastern Summer Time) Campaign Time : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Campaign Session : 1 Campaign Count : 1 Campaign Source : (direct) Campaign Medium : (none); Campaign Name : (direct) Language : en-gb Encoding : UTF-8 Flash Version : 11.4 r31 Java Enabled : true Screen Resolution : 1050x1680 Browser Size : 1033x861 Color Depth : 32-bit Ga.js Version : 5.3.7d Cachebuster : 1580443754 and this is the js code he have. BTW, it is inside a <head></head> <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push( ['_setAccount', 'UA-XXXX234-1'], ['_setDomainName', 'website.com'], ['_setAllowLinker', true], ['_trackPageview'] ,['b._setAccount','UA-XXXX234-11'], ['b._setDomainName','website.com'], ['b._setAllowLinker',true], ['b._trackPageview'] ); (function () { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); </script> Finally, I've triple checked that the UA is the correct text. and yes, the global account is -1 and the specific domain is -11. Anyone have any suggestions to help?

    Read the article

  • NGINX Remove index.php /index.php/something/more/ to /something/more

    - by Gaston
    I'm trying to clean URLs in nginx for the DooPHP framework, turning this:
    http://example.com/index.php/something/more/
    into this:
    http://example.com/something/more/
    I want to remove the "index.php" from the URL (clean URL) if someone enters it in the first form, like a permanent redirect. How do I configure this in nginx? Thanks.
    [Update: actual nginx config]
    server {
    listen 80;
    server_name vip.example.com;
    rewrite ^/(.*) https://vip.example.com/$1 permanent;
    }
    server {
    listen 443;
    server_name vip.example.com;
    error_page 404 /vip.example.com/404.html;
    error_page 403 /vip.example.com/403.html;
    error_page 401 /vip.example.com/401.html;
    location /vip.example.com {
    root /sites/errors;
    }
    ssl on;
    ssl_certificate /etc/nginx/config/server.csr;
    ssl_certificate_key /etc/nginx/config/server.sky;
    if (!-e $request_filename){
    rewrite /.* /index.php;
    }
    location / {
    auth_basic "example Team Access";
    auth_basic_user_file config/htpasswd;
    root /sites/vip.example.com;
    index index.php;
    }
    location ~ \.php$ {
    fastcgi_pass 127.0.0.1:9000;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME /sites/vip.example.com$fastcgi_script_name;
    include fastcgi_params;
    fastcgi_param PATH_INFO $fastcgi_script_name;
    }
    }
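
    A hedged sketch of one way to express that redirect inside the HTTPS server block (a plain rewrite, placed before the framework's internal rewrite to index.php):
        # send /index.php/whatever permanently to /whatever
        rewrite ^/index\.php/(.*)$ /$1 permanent;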

    Read the article

  • htaccess to change url

    - by Guo Hong Lim
    I have the following code in my .htaccess but it doesn't work right. Is it because mod_rewrite is not "on"? If so, how can I check?
    Options +FollowSymlinks
    RewriteEngine on
    RewriteRule ^(.*)\$ $1.php [nc]
    What I want is to rewrite my addresses, for example:
    http://www.abc.com -> http://www.abc.com
    http://abc.com -> http://www.abc.com
    http://www.abc.com/123.html -> http://www.abc.com/123
    http://www.abc.com/12-12-12.html -> http://www.abc.com/12-12-12
    http://subdomain.abc.com/123.html -> http://subdomain.abc.com/123
    Basically removing the .html extension and ensuring that the www is intact.
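
    For reference, whether mod_rewrite is loaded can be checked with something like "apache2ctl -M | grep rewrite", and the extension hiding plus www forcing are usually two separate rules. A hedged .htaccess sketch using the domain from the question:
        Options +FollowSymlinks
        RewriteEngine On

        # force www on the bare domain
        RewriteCond %{HTTP_HOST} ^abc\.com$ [NC]
        RewriteRule ^(.*)$ http://www.abc.com/$1 [R=301,L]

        # serve /foo.html when /foo is requested, so links can omit the extension
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}.html -f
        RewriteRule ^(.+)$ $1.html [L]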

    Read the article
