Search Results

Search found 13652 results on 547 pages for 'full'.


  • .htaccess - RewriteRules working, but browser address bar displaying full (unfriendly) URL

    - by axol
    Hey there, I haven't been able to find a solution to this around the net or these forums - apologies if I've missed something! My .htaccess RewriteRules are working well - I have search-engine- and user-friendly links in my web pages, and unfriendly database URLs running hidden in the background. Except when I added a RewriteRule to prepend "www." to URLs if the user didn't enter it - to ensure only one form appears in search engines. Here's what's now happening, and I can't figure out why! My friendly URL structure for content is like this, and the database query string uses the first "importantword": www.example.com/importantword-nonimportantword/

    .htaccess snippet:

        Options +FollowSymLinks
        Options -Indexes
        RewriteEngine on
        RewriteOptions MaxRedirects=10
        RewriteBase /
        RewriteRule ^/$ index.php [L]
        RewriteRule ^(.*)-(.*)/overview/$ detail.php?categoryID=$1 [L]
        RewriteCond %{HTTP_HOST} !^www.example.com$
        RewriteRule ^(.*)$ http://www.example.com/$1 [L]

    What's happening since I added the last 2 lines:

    CASE 1: user types (or clicks) www.example.com/honda-vehicles/overview/ - works correctly - they are taken to the correct page and the browser URL bar says: www.example.com/honda-vehicles/overview/

    CASE 2: user types example.com - works correctly - they are taken to www.example.com and the browser URL bar says: www.example.com

    CASE 3: user types (or clicks) example.com/honda-vehicles/overview/ i.e. without the "www" prefix - does NOT work correctly - they are taken to the right page, but the browser URL bar displays the unfriendly URL: www.example.com/detail.php?categoryID=honda

    I figure there's some issue with the order of the RewriteRules, but it's doing my head in trying to logically step through it and figure it out! Any assistance at all or pointers would be most appreciated!
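
    A sketch of one likely fix, assuming a single canonical www host is the goal: the host-canonicalizing rule needs to run first and be an explicit external redirect (R=301). As written above it runs after the friendly URL has already been internally rewritten to detail.php, and its absolute-URL substitution then triggers an implicit redirect, which is why the unfriendly URL leaks into the address bar in CASE 3:

        RewriteEngine on
        RewriteBase /

        # Canonicalize the host first, as an explicit 301,
        # before any internal rewrites can run.
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

        # Internal rewrites only ever see the canonical host.
        # (^/?$ rather than ^/$, since per-directory rewrites strip the leading slash.)
        RewriteRule ^/?$ index.php [L]
        RewriteRule ^(.*)-(.*)/overview/$ detail.php?categoryID=$1 [L]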


  • How to use a specific PDF IFilter

    - by dthrasher
    I'm trying to extract text from PDF files using an IFilter. The Adobe PDF IFilter that is distributed with Adobe Reader is awful, returning HRESULT E_FAIL messages for many PDF documents. The FoxIt PDF IFilter works beautifully on virtually all of the PDFs I've been using for testing. The problem is that every time the Adobe Updater runs, it replaces the awesome FoxIt IFilter with the crappy Adobe IFilter. I've been using the LoadIFilter method to get the registered IFilter for PDF files. Is there a way to force the Win32 API to load the FoxIt IFilter instead of the Adobe IFilter? NOTE: This question about determining which IFilters are installed asks a related - but not identical - question.
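
    A sketch of one way around this, assuming the goal is to bypass LoadIFilter's registry lookup entirely: instantiate the FoxIt filter's COM class directly by CLSID and bind the document yourself. The CLSID below is a placeholder (the real one would be read from the registry on a machine where FoxIt is installed), and header/linking details vary by SDK:

        #include <windows.h>
        #include <filter.h>   // IFilter; IID_IFilter availability varies by SDK

        // Hypothetical placeholder - substitute the FoxIt PDF IFilter's
        // actual CLSID from HKCR\CLSID on a machine where it is installed.
        static const CLSID CLSID_FoxitPdfFilter =
            { 0x00000000, 0x0000, 0x0000,
              { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 } };

        IFilter* LoadFoxitFilter(LPCWSTR path)
        {
            // Create the specific filter class instead of calling LoadIFilter().
            IFilter* pFilter = NULL;
            if (FAILED(CoCreateInstance(CLSID_FoxitPdfFilter, NULL,
                                        CLSCTX_INPROC_SERVER, IID_IFilter,
                                        (void**)&pFilter)))
                return NULL;

            // Most filters implement IPersistFile; use it to bind the document.
            IPersistFile* pPersist = NULL;
            if (SUCCEEDED(pFilter->QueryInterface(IID_IPersistFile,
                                                  (void**)&pPersist)))
            {
                pPersist->Load(path, STGM_READ);
                pPersist->Release();
            }
            return pFilter;  // caller calls Init()/GetChunk(), then Release()
        }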


  • JMeter to grab full querystring into a variable for future use

    - by jfbauer
    Someone provided me the regex (?<=\?)[^?]+$ to parse out a query string. I am trying to use it in JMeter with no luck (although I have been successful in pulling out individual query string parameter values based on various example postings on the web). I created a Regular Expression Extractor called "Grab QueryString" and configured it like this:

        Response field to check: URL
        Reference name:          myQueryString
        Regular expression:      (?<=\?)[^?]+$
        Template:                $1$
        Match no.:               1
        Default value:           ERROR

    Unfortunately, "myQueryString" is getting populated with ERROR rather than the URL query string when I try to use it as a parameter in a later GET. Thus, I see this in the View Results Tree:

        https://www.website.com/folder/page.aspx?ERROR

    instead of:

        https://www.website.com/folder/page.aspx?jfhjHSDjgdjhsjhsdhjSJHWed

    Did I do something wrong? Anyone have any suggestions?
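
    One likely culprit, judging from the settings above: the regex consists only of a lookbehind plus an uncaptured match, so there is no group 1 for the $1$ template to refer to, and the extractor falls back to the default. A sketch of two variants that should populate the variable (the second assumes the regex engine accepts lookbehind, which JMeter's older ORO-based engine may not):

        Regular expression: \?(.+)$          Template: $1$   (adds a capture group)
        Regular expression: (?<=\?)[^?]+$    Template: $0$   (uses the whole match)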


  • Full JSON-RPC specifications

    - by Artyom
    Hello, I'm going to implement a JSON-RPC web service and I need specifications for it. So far I have found only one resource that can be called a real specification: JSON-RPC 1.0, http://json-rpc.org/wiki/specification. There is also the proposal for JSON-RPC 2.0: http://groups.google.com/group/json-rpc/web/json-rpc-2-0 (why is it on Google Groups?). However, I've seen that JavaScript frameworks like Dojo actively use the JSON-RPC SMD (Service Mapping Description) proposal, which in turn requires the JSON Schema specification - and its reference link redirects to an incorrect URL. So far I have found the following, and it is still a draft: http://tools.ietf.org/html/draft-zyp-json-schema-02. Can anybody point me to some actual specifications, or at least something official and updated? It looks like implementing JSON-RPC 1.0 and 2.0 would not be enough, at least for frameworks like Dojo. Or am I wrong? Questions: Is it enough to implement the JSON-RPC 1.0 specification and the 2.0 draft to be on the safe side - would that work for most JSON-RPC clients? If I should implement SMD, or it is recommended, can somebody point to official specifications of JSON Schema and Service Mapping Description, or are the links I found really the "specifications"? Note: do not suggest existing JSON-RPC service implementations.
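
    For reference, a minimal JSON-RPC 2.0 request/response pair in the format the 2.0 proposal defines (the method name and params are just an example):

        --> {"jsonrpc": "2.0", "method": "subtract", "params": [42, 23], "id": 1}
        <-- {"jsonrpc": "2.0", "result": 19, "id": 1}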


  • Sphinx PHP search

    - by James
    I'm doing a Sphinx search but turning up some really weird results. Any help is appreciated. So for example if I type "50", I get: 50 Cent, 50 Lions, 50 Foot Wave, etc. This is great, but when I search "50 Ce", I get: Ryczace Dwudziestki, Spisek, Bernhard Gal, Cowabunga Go-Go, and other crazy results. Also when I search for "50 Cent", the correct result is at the top, but then random results below. Any ideas why?

    PHP code:

        $query = $_GET['query'];
        if (!empty($query)) {
            $sphinx->SetMatchMode(SPH_MATCH_ALL);
            $sphinx->AddQuery($query, 'artists');
            $sphinx->AddQuery($query, 'variations');
            $sphinx->SetFilter('name', array(3));
            $sphinx->SetLimits(0, 10);
            $result = $sphinx->RunQueries();
            echo '<pre>';
            switch ($result) {
                case false:
                    echo 'Query failed: ' . $sphinx->GetLastError() . "\n";
                    break;
                default:
                    if ($sphinx->GetLastWarning()) {
                        echo 'WARNING: ' . $sphinx->GetLastWarning() . "\n";
                    }
                    if (is_array($result[0]['matches']) && count($result[0]['matches'])) {
                        foreach ($result[0]['matches'] as $value => $info) {
                            $artist = artistDetails($value);
                            echo $artist['name'] . "\n";
                        }
                    }
            }
        }

    Sphinx index and source:

        source artists
        {
            type      = mysql
            sql_host  = localhost
            sql_user  = user
            sql_pass  = pass
            sql_db    = db
            sql_port  = 3300
            sql_query = \
                SELECT \
                    id, name \
                FROM artists;
            #UNIX_TIMESTAMP(time)
            #sql_attr_uint      = group_id
            #sql_attr_timestamp = time
            sql_query_info = SELECT id, name FROM artists WHERE id=$id
        }

        index artists
        {
            source       = artists
            path         = /var/sphinx/artists
            docinfo      = extern
            charset_type = utf-8
        }
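
    A guess at the cause, with a sketch of the usual fix: SPH_MATCH_ALL requires every keyword to match, and "Ce" is not a complete indexed word, so partial words only behave sensibly when the index is built for prefix matching. Assuming a 0.9.x-era Sphinx, that means something like the following in the index definition (both settings require a reindex):

        index artists
        {
            # ... existing settings as above ...
            min_prefix_len = 1    # index every word prefix
            enable_star    = 1    # so a trailing * means "prefix match"
        }

    On the PHP side the last, possibly incomplete word would then be starred - e.g. sending "50 Ce*" so that "Ce*" can match "Cent".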


  • Indexing CSV file contents in Python

    - by Hossein
    Hi, I have a very large CSV file containing only two fields (id, url). I want to do some indexing on the url field with Python. I know that there are tools like Whoosh or PyLucene, but I can't get the examples to work. Can someone help me with this?
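
    A minimal sketch of indexing such a file with Whoosh, assuming Python 3 and a header-less two-column CSV; the file and directory names are illustrative:

        import csv
        import os
        from whoosh import index
        from whoosh.fields import Schema, ID, TEXT
        from whoosh.qparser import QueryParser

        # One stored ID column, one analyzed, searchable text column for the URL.
        schema = Schema(id=ID(stored=True), url=TEXT(stored=True))

        if not os.path.exists("indexdir"):
            os.mkdir("indexdir")
        ix = index.create_in("indexdir", schema)

        # Stream the CSV rows into the index.
        writer = ix.writer()
        with open("data.csv", newline="") as f:
            for row_id, url in csv.reader(f):
                writer.add_document(id=row_id, url=url)
        writer.commit()

        # Query the url field.
        with ix.searcher() as searcher:
            query = QueryParser("url", ix.schema).parse("example")
            for hit in searcher.search(query, limit=10):
                print(hit["id"], hit["url"])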


  • SQL Server 2008 ContainsTable, CTE, and Paging

    - by David Murdoch
    I'd like to perform efficient paging using CONTAINSTABLE. The following query selects the top 10 ranked results from my database using CONTAINSTABLE when searching for a name (first or last) that begins with "Joh":

        DECLARE @Limit int;
        SET @Limit = 10;

        SELECT TOP (@Limit)
            c.ChildID, c.PersonID, c.DOB, c.Gender
        FROM [Person].[vFullName] AS v
        INNER JOIN CONTAINSTABLE(
            [Person].[vFullName],
            (FullName),
            'ISABOUT ("Joh*" WEIGHT (.4), "Joh" WEIGHT (.6))'
        ) AS k3 ON v.PersonID = k3.[KEY]
        JOIN [Child].[Details] c ON c.PersonID = v.PersonID
        JOIN [Person].[Details] p ON p.PersonID = c.PersonID
        ORDER BY k3.RANK DESC, FullName ASC, p.Active DESC, c.ChildID ASC

    I'd like to combine it with the following CTE, which returns the 11th through 20th results ordered by ChildID (the primary key):

        DECLARE @Start int;
        DECLARE @Limit int;
        SET @Start = 10;
        SET @Limit = 10;

        WITH ChildEntities AS (
            SELECT ROW_NUMBER() OVER (ORDER BY ChildID) AS Row,
                   ChildID
            FROM Child.Details
        )
        SELECT c.ChildID, c.PersonID, c.DOB, c.Gender
        FROM ChildEntities cte
        INNER JOIN Child.Details c ON cte.ChildID = c.ChildID
        WHERE cte.Row BETWEEN @Start+1 AND @Start+@Limit
        ORDER BY cte.Row ASC
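
    A sketch of one way to combine the two, under the assumption that pages should follow the full-text rank ordering: move the CONTAINSTABLE join inside the CTE, number the rows with the same ORDER BY the first query uses, and page on that row number (reusing @Start and @Limit from above):

        WITH RankedChildren AS (
            SELECT ROW_NUMBER() OVER (
                       ORDER BY k3.RANK DESC, v.FullName ASC,
                                p.Active DESC, c.ChildID ASC) AS Row,
                   c.ChildID, c.PersonID, c.DOB, c.Gender
            FROM [Person].[vFullName] AS v
            INNER JOIN CONTAINSTABLE(
                [Person].[vFullName], (FullName),
                'ISABOUT ("Joh*" WEIGHT (.4), "Joh" WEIGHT (.6))'
            ) AS k3 ON v.PersonID = k3.[KEY]
            JOIN [Child].[Details] c ON c.PersonID = v.PersonID
            JOIN [Person].[Details] p ON p.PersonID = c.PersonID
        )
        SELECT ChildID, PersonID, DOB, Gender
        FROM RankedChildren
        WHERE Row BETWEEN @Start+1 AND @Start+@Limit
        ORDER BY Row ASC;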


  • Submit HTML form with GET method with full action path

    - by Manny Calavera
    Hello. I am trying to submit a form with method GET and action "index.php?id=3". The problem is that it jumps to the URL specified but cuts off the "?id=3" part. I need that so I can identify a user. Any ideas on how I could submit the whole URL? I don't want to change the method; by design the real thing is much more complex than described here - this is a simplified version. Any ideas? Thanks.
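
    For what it's worth, this is standard HTML behaviour rather than a bug: on a GET submission the browser discards the action's query string and rebuilds it from the form's fields. A minimal sketch of the usual workaround, carrying the value as a hidden field instead (the field name id mirrors the URL above; the other fields are placeholders for whatever the form already contains):

        <form action="index.php" method="get">
            <!-- Re-emitted into the new query string on submit -->
            <input type="hidden" name="id" value="3">
            <input type="text" name="q">
            <input type="submit" value="Go">
        </form>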


  • Full page border | CSS

    - by Wayne
    Isn't there a CSS way to get a border around the whole page, so that even if the content isn't big enough to make the page scroll, there's still a border around the full page? I think I remember seeing a CSS method for this before, but I can't recall it - do you know? If you do, many thanks :)
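
    A sketch of two possible approaches, assuming the border should reach the edges of the viewport even on short pages:

        /* Option 1: border on the root element, stretched to the viewport */
        html {
            min-height: 100%;
            border: 4px solid #333;
            box-sizing: border-box;  /* keep the border inside the 100% height */
        }

        /* Option 2: a fixed, click-through frame that stays put while scrolling
           (pointer-events on HTML elements needs a reasonably modern browser) */
        body::after {
            content: "";
            position: fixed;
            top: 0; right: 0; bottom: 0; left: 0;
            border: 4px solid #333;
            pointer-events: none;
        }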


  • Prevent full table scan for query with multiple where clauses

    - by Dave Jarvis
    A while ago I posted a message about optimizing a query in MySQL. I have since ported the data and query to PostgreSQL, but now PostgreSQL has the same problem. The solution in MySQL was to force the join order with STRAIGHT_JOIN; PostgreSQL offers no such option. Here is the query:

        SELECT
            avg(d.amount) AS amount,
            y.year
        FROM
            station s,
            station_district sd,
            year_ref y,
            month_ref m,
            daily d
        LEFT JOIN city c ON c.id = 10663
        WHERE
            -- Find all the stations within a specific unit radius ...
            6371.009 * SQRT(
                POW(RADIANS(c.latitude_decimal - s.latitude_decimal), 2) +
                (COS(RADIANS(c.latitude_decimal + s.latitude_decimal) / 2) *
                 POW(RADIANS(c.longitude_decimal - s.longitude_decimal), 2))
            ) <= 50 AND
            -- Ignore stations outside the given elevations --
            s.elevation BETWEEN 0 AND 2000 AND
            sd.id = s.station_district_id AND
            -- Gather all known years for that station ...
            y.station_district_id = sd.id AND
            -- The data before 1900 is shaky; insufficient after 2009. --
            y.year BETWEEN 1980 AND 2000 AND
            -- Filtered by all known months ...
            m.year_ref_id = y.id AND
            m.month = 12 AND
            -- Whittled down by category ...
            m.category_id = '001' AND
            -- Into the valid daily climate data. --
            m.id = d.month_ref_id AND
            d.daily_flag_id <> 'M'
        GROUP BY y.year

    It appears as though PostgreSQL is looking at the DAILY table first, which is simply not the right way to go about this query, as there are nearly 300 million rows in it. How do I force PostgreSQL to start at the CITY table? Thank you!
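
    A sketch of the closest PostgreSQL equivalent, for what it's worth: with join_collapse_limit set to 1, the planner keeps explicit JOIN syntax in the order it is written, so rewriting the comma-joins as explicit JOINs starting from city effectively pins the join order:

        -- Per session: honour the written JOIN order instead of re-planning it.
        SET join_collapse_limit = 1;

        SELECT avg(d.amount) AS amount, y.year
        FROM city c
        JOIN station s
          ON 6371.009 * SQRT(...) <= 50       -- distance formula unchanged from above
        JOIN station_district sd ON sd.id = s.station_district_id
        JOIN year_ref y  ON y.station_district_id = sd.id
        JOIN month_ref m ON m.year_ref_id = y.id
        JOIN daily d     ON m.id = d.month_ref_id
        WHERE c.id = 10663
          AND s.elevation BETWEEN 0 AND 2000
          AND y.year BETWEEN 1980 AND 2000
          AND m.month = 12
          AND m.category_id = '001'
          AND d.daily_flag_id <> 'M'
        GROUP BY y.year;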


  • Rails add a param to the end of a search query

    - by bob
    Hello, I am trying to implement a sort on the returned results of a search with Rails. Filter by: 'recent' | 'lowest_price' | 'highest_price' - these are my links. I would like to add a filter to the end of my search query. For example, I submit a search, I get back results, and in my URL I have:

        http://localhost:3000/junks?search=handbag&condition=&category=&main_submit=Go!

    I would then like the user to be able to click one of the above links and have the page come back with a new query. I have this set up in the controller like this (the snippet is cut off at the outer else):

        con_id = params[:condition]
        if params[:category].nil? || params[:category].empty?
          cat_id = params[:category]
          results = Junk.search params[:search],
                                :with => { :category_id => cat_id, :condition_id => con_id }
          case params[:filter]
          when "lowest_price"
            @junks = results.lowest_price.paginate :page => params[:page], :per_page => 12
          when "highest_price"
            @junks = results.highest_price.paginate :page => params[:page], :per_page => 12
          else
            @junks = results.paginate :page => params[:page], :per_page => 12
          end
        else

    I would like the user to be able to click one of the above links and append a filter param to the end of the search query, so that my controller can pick it up and call the correct database queries as seen in the case statement above. I'm guessing the URL will look like this:

        http://localhost:3000/junks?search=handbag&condition=&category=&main_submit=Go!&filter="lowest_price"

    How can I do this? Is there a better way?
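
    A sketch of one way to build those links in an ERB view, assuming a Rails 2/3-era app: merging the filter into the current request's params keeps the existing search, condition, and category values intact while adding (or swapping) the filter key:

        <!-- Each link re-submits the current query with a filter appended -->
        Filter by:
        <%= link_to "Recent",        params.merge(:filter => "recent") %> |
        <%= link_to "Lowest price",  params.merge(:filter => "lowest_price") %> |
        <%= link_to "Highest price", params.merge(:filter => "highest_price") %>

    The resulting URL then ends in &filter=lowest_price (no quotes needed), which the case statement above already matches.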


  • Can AVAudioSession do full duplex?

    - by Eric Christensen
    It would seem like it should be able to, but the following breakout test code can't do both:

        // play a file:
        NSArray *pathsArray = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [pathsArray objectAtIndex:0];
        NSString *playFilePath = [documentsDirectory stringByAppendingPathComponent:@"testplayfile.wav"];
        AVAudioPlayer *tempplayer = [[AVAudioPlayer alloc]
            initWithContentsOfURL:[NSURL fileURLWithPath:playFilePath] error:nil];
        [tempplayer prepareToPlay];
        [tempplayer play];

        // and record a file:
        NSString *recFilePath = [documentsDirectory stringByAppendingPathComponent:@"testrecordfile.wav"];
        AVAudioRecorder *soundrecording = [[AVAudioRecorder alloc]
            initWithURL:[NSURL fileURLWithPath:recFilePath] settings:nil error:nil];
        [soundrecording prepareToRecord];
        [soundrecording record];

    This is the minimum I can think of to individually play one file and record another, and it works just fine in the simulator: I can play back a file and record at the same time. But it doesn't work on the iPhone itself. If I comment out either function, the other performs fine. The playback plays fine either alone, or with both if it's first. If I comment out the playback, the record records fine. (There's additional code to stop the recording, not shown here.) So each works fine, but not together. I know Audio Queue has a setting to allow both, but I don't see an analogue for AVAudioSession. Any idea if it's possible, and if so, what I need to add? Thanks!
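
    A sketch of the piece that is usually missing on the device (the simulator is far more permissive about audio session categories): setting the shared session's category to PlayAndRecord before starting either the player or the recorder, since the default category does not allow recording:

        #import <AVFoundation/AVFoundation.h>

        NSError *error = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        // PlayAndRecord enables simultaneous input and output on the device.
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
        [session setActive:YES error:&error];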


  • SqlBulkCopy is slow, doesn't utilize full network speed

    - by Alex
    Hi, for the past couple of weeks I have been creating a generic script that is able to copy databases. The goal is to be able to specify any database on some server and copy it to some other location, copying only the specified content. The exact content to be copied over is specified in a configuration file. This script is going to be used on some 10 different databases and run weekly. In the end we are copying only about 3%-20% of databases which are as large as 500GB. I have been using the SMO assemblies to achieve this. This is my first time working with SMO and it took a while to create a generic way to copy the schema objects, filegroups, etc. (Actually helped find some bad stored procs.) Overall I have a working script which is lacking in performance (and at times, times out) and I was hoping you guys would be able to help. When executing the WriteToServer command to copy a large amount of data (> 6GB) it reaches my timeout period of 1hr. Here is the core code for copying table data. The script is written in PowerShell:

        $query = ("SELECT * FROM $selectedTable " + $global:selectiveTables.Get_Item($selectedTable)).Trim()
        Write-LogOutput "Copying $selectedTable : '$query'"

        $cmd = New-Object Data.SqlClient.SqlCommand -argumentList $query, $source
        $cmd.CommandTimeout = 120

        $bulkData = ([Data.SqlClient.SqlBulkCopy]$destination)
        $bulkData.DestinationTableName = $selectedTable
        $bulkData.BulkCopyTimeout = $global:tableCopyDataTimeout  # = 3600

        $reader = $cmd.ExecuteReader()
        $bulkData.WriteToServer($reader)  # Takes forever here on large tables

    The source and target databases are located on different servers, so I kept track of the network speed as well. The network utilization never went over 1%, which was quite surprising to me; when I just transfer some large files between the servers, the network utilization spikes up to 10%. I have tried setting $bulkData.BatchSize to 5000 but nothing really changed. Increasing the BulkCopyTimeout to an even greater amount would only paper over the timeout. I really would like to know why the network is not being used fully. Anyone else had this problem? Any suggestions on networking or bulk copy will be appreciated. And please let me know if you need more information. Thanks.

    UPDATE: I have tweaked several options that increase the performance of SqlBulkCopy, such as setting the transaction logging to simple and providing a table lock to SqlBulkCopy instead of the default row lock. Also, some tables are better optimized for certain batch sizes. Overall, the duration of the copy was decreased by some 15%. What we will do is execute the copy of each database simultaneously on different servers. But I am still having a timeout issue when copying one of the databases. When copying one of the larger databases, there is a table for which I consistently get the following exception: System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. It is thrown about 16 after it starts copying the table, which is nowhere near my BulkCopyTimeout. Even though I get the exception, the table is fully copied in the end. Also, if I truncate that table and restart my process for that table only, the table is copied over without any issues. But going through the process of copying the entire database always fails for that one table. I have tried executing the entire process and resetting the connection before copying that faulty table, but it still errored out.
    My SqlBulkCopy and Reader are closed after each table. Any suggestions as to what else could be causing the script to fail at that point each time?
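
    A few tweaks worth trying, sketched against the variables above. One suspect for the early, pre-BulkCopyTimeout exception: $cmd.CommandTimeout (120 s) governs the source reader too, and it can fire mid-copy if the source stalls while WriteToServer applies backpressure. The TableLock option is shown only for completeness, since the update above already covers it:

        $cmd.CommandTimeout = 0              # 0 = unlimited; the source SELECT's
                                             # timeout also applies while the
                                             # reader is drained during the copy

        $options  = [Data.SqlClient.SqlBulkCopyOptions]::TableLock
        $bulkData = New-Object Data.SqlClient.SqlBulkCopy($destination, $options, $null)
        $bulkData.DestinationTableName = $selectedTable
        $bulkData.BulkCopyTimeout      = 0      # 0 = no limit, rather than a fixed 3600
        $bulkData.BatchSize            = 10000  # commit in chunks to keep destination
                                                # locks and log growth in check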


  • Silverlight 4 - elevated permission *inside* the browser

    - by Doug
    I know Silverlight 4 can handle elevated permissions outside the browser. Is there a way to accomplish this inside the browser? I need to make a folder/file upload manager that gives a better user experience than the standard <input type="file">, and I'd like to implement it in Silverlight. I know Java has an option to gain elevated permissions, but you have to attach a signed certificate to your app. Does Silverlight 4 have a similar option - to gain elevated permissions by attaching a signed certificate (after warning the user, of course)? -Doug


  • jquery full calendar

    - by madrat
    Hello everybody. I am new to fullCalendar and I do not know how to use the gotoDate method. What I have is a tabbed page with 12 tabs (every tab is a month of the year). What I need is: when the calendar loads for a given tab, it should switch to the month assigned to that tab (e.g. when I click and load the page for January, the calendar shows the days for January, and so on). Thanks for your help.
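
    A minimal sketch of wiring that up, assuming fullCalendar 1.x (where gotoDate takes year, month, date with a zero-based month) and twelve sibling tab links; the selectors and year are illustrative:

        // 0 = January ... 11 = December, taken from the clicked tab's position
        $('#month-tabs a').click(function () {
            var monthIndex = $(this).index();
            $('#calendar').fullCalendar('gotoDate', 2010, monthIndex);
        });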


  • Timeout reading verity collection - CF8

    - by Gary
    For a long time now I've been having a problem using the Verity search service bundled with ColdFusion 8. The issue is timeout errors occurring when performing any operation on a collection. It's intermittent, and usually occurs after a few operations have been performed successfully. For instance: if I'm adding records to a collection, the first, say, 15 records will go through with no problems, but all subsequent records will time out until the service is rebooted. I'm on a shared server, Windows 2008, 64-bit as far as I know. The error I receive is: "An error occurred while performing an operation in the Search Engine library. Error reading collection information.: com.verity.api.administration.ConfigurationException: java.io.IOException: Read timed out". Having spoken to my hosting company, and after doing some research, it's been suggested that the number of collections on a server may cause this issue. I've reduced the number of collections I use, and there are currently 39 collections on the server. As I'm on a shared server, I have no control over how many collections other customers use; however, I've read that the limit is 128 collections, so I don't see why 39 should make it unusable. The collections aren't big - there are maybe 5,000 records between all of them. Any ideas?


  • Using IVMRWindowlessControl to display video in a Winforms Control and allow for full screen toggle

    - by Jonathan Websdale
    I've recently switched from using the IVideoWindow interface to IVMRWindowlessControl in my custom WinForms control to display video. The reason for the switch was to allow zoom capabilities on the video within the control. However, in switching over I've found that the full-screen mode from IVideoWindow is not available, and I am currently trying to replicate it using the SetVideoWindow() method. I can size the video in my control to the same resolution as the screen; however, I can't get the control to position itself at the top/left of the screen and become the topmost window. Any ideas on how to achieve this, since IVideoWindow::put_FullScreenMode just did it all for you?
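
    A sketch of replicating full-screen by hand with plain Win32 calls, assuming hwndOwner is the window handle previously passed to SetVideoClippingWindow and pWindowless is the IVMRWindowlessControl pointer (in a WinForms control, the window-resizing part could equally be done on the managed side):

        // Find the monitor the video window currently sits on.
        HMONITOR mon = MonitorFromWindow(hwndOwner, MONITOR_DEFAULTTONEAREST);
        MONITORINFO mi = { sizeof(mi) };
        GetMonitorInfo(mon, &mi);
        RECT rc = mi.rcMonitor;

        // Strip the borders, go topmost, and cover the monitor.
        SetWindowLong(hwndOwner, GWL_STYLE, WS_POPUP | WS_VISIBLE);
        SetWindowPos(hwndOwner, HWND_TOPMOST, rc.left, rc.top,
                     rc.right - rc.left, rc.bottom - rc.top,
                     SWP_FRAMECHANGED | SWP_SHOWWINDOW);

        // Tell the VMR to fill the (now full-screen) client area.
        RECT client;
        GetClientRect(hwndOwner, &client);
        pWindowless->SetVideoPosition(NULL, &client);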


  • How to optimize Core Data query for full text search

    - by dk
    Can I optimize a Core Data query when searching for matching words in a text? (This question also pertains to the wisdom of custom SQL versus Core Data on an iPhone.) I'm working on a new (iPhone) app that is a handheld reference tool for a scientific database. The main interface is a standard searchable table view and I want as-you-type response as the user types new words. Word matches must be prefixes of words in the text. The text is composed of 100,000s of words. In my prototype I coded SQL directly. I created a separate "words" table containing every word in the text fields of the main entity. I indexed words and performed searches along the lines of:

        SELECT id, *
        FROM textTable
        JOIN (SELECT DISTINCT textTableId
              FROM words
              WHERE word BETWEEN 'foo' AND 'fooz') ON id = textTableId
        LIMIT 50

    This runs very fast. Using an IN would probably work just as well, i.e.:

        SELECT *
        FROM textTable
        WHERE id IN (SELECT textTableId
                     FROM words
                     WHERE word BETWEEN 'foo' AND 'fooz')
        LIMIT 50

    The LIMIT is crucial and allows me to display results quickly. I notify the user that there are too many to display if the limit is reached. This is kludgy. I've spent the last several days pondering the advantages of moving to Core Data, but I worry about the lack of control in the schema, indexing, and querying for an important query. Theoretically an NSPredicate of textField MATCHES '.*\bfoo.*' would just work, but I'm sure it will be slow. This sort of text search seems so common that I wonder what is the usual attack? Would you create a words entity as I did above and use a predicate of "word BEGINSWITH 'foo'"? Will that work as fast as my prototype? Will Core Data automatically create the right indexes? I can't find any explicit means of advising the persistent store about indexes. I see some nice advantages of Core Data in my iPhone app. The faulting and other memory considerations allow for efficient database retrievals for tableview queries without setting arbitrary limits. The object graph management allows me to easily traverse entities without writing lots of SQL. Migration features will be nice in the future. On the other hand, in a limited resource environment (iPhone) I worry that an automatically generated database will be bloated with metadata, unnecessary inverse relationships, inefficient attribute datatypes, etc. Should I dive in or proceed with caution?
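
    A sketch of the Core Data analogue of that prototype schema, assuming a Word entity with an indexed word string attribute and a to-many words relationship on the main entity; the TextEntry entity and relationship names here are illustrative:

        // Pre-ARC style. Fetch up to 50 entries containing a word with the prefix.
        NSFetchRequest *request = [[[NSFetchRequest alloc] init] autorelease];
        [request setEntity:[NSEntityDescription entityForName:@"TextEntry"
                                       inManagedObjectContext:context]];
        [request setPredicate:
            [NSPredicate predicateWithFormat:@"ANY words.word BEGINSWITH %@", prefix]];
        [request setFetchLimit:50];

        NSError *error = nil;
        NSArray *matches = [context executeFetchRequest:request error:&error];

    For what it's worth, the "Indexed" checkbox on an attribute in the Xcode data model editor is the closest Core Data comes to letting you advise the persistent store about indexes.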


  • Can't fully read querystring data in an MVC4 application using a class array as input to an action

    - by Anders Lindén
    I have made an MVC4 application and I have a controller that outputs a PNG file like this:

        using System.Drawing;
        using System.Drawing.Drawing2D;
        using System.Drawing.Imaging;
        using System.Drawing.Text;
        using System.IO;
        using System.Web.Mvc;
        using Foobar.Classes;

        namespace Foobar.Controllers
        {
            public class ImageController : Controller
            {
                public ActionResult Index(Label[] labels)
                {
                    var bmp = new Bitmap(400, 300);
                    var pen = new Pen(Color.Black);
                    var font = new Font("arial", 20);
                    var g = Graphics.FromImage(bmp);
                    g.SmoothingMode = SmoothingMode.HighQuality;
                    g.TextRenderingHint = TextRenderingHint.AntiAliasGridFit;
                    if (labels != null)
                    {
                        g.DrawString("" + labels.Length, font, pen.Brush, 20, 20);
                        if (labels.Length > 0)
                        {
                            g.DrawString("" + labels[0].label, font, pen.Brush, 20, 40);
                        }
                    }
                    var stream = new MemoryStream();
                    bmp.Save(stream, ImageFormat.Png);
                    stream.Position = 0;
                    return File(stream, "image/png");
                }
            }
        }

    The Label class looks like this:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;

        namespace Foobar.Classes
        {
            public class Label
            {
                public string label { get; set; }
                public int fontsize { get; set; }
            }
        }

    When I run my controller with this URL:

        http://localhost:57775/image?labels[0][label]=Text+rad+1&labels[0][fontsize]=5&labels[1][fontsize]=5&labels[2][fontsize]=5

    I get the correct number of labels, so the image shows "3", but the instances of Label do not get their data members filled in. I have also tried this using plain fields (not properties). If they were filled in, the image would actually show "3" and "Text rad 1". So what do I put in the Label class to get the properties bound correctly? Should there be some kind of annotation? Where do I read about this?
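
    For what it's worth, the default MVC model binder expects dot notation for members of a complex type inside a collection; the bracket indexer is only for the collection itself. A sketch of a query string that should bind, with the same values as above:

        http://localhost:57775/image?labels[0].label=Text+rad+1&labels[0].fontsize=5&labels[1].fontsize=5&labels[2].fontsize=5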


  • Couple o' quick questions on Apache Lucene

    - by Doug
    -- I don't want to start any religious wars, but a quick Google search indicates that Apache Lucene is the preferred open source tool for indexing and searching. Are there others? -- What file format does Lucene use to store its index file(s)? Thanks in advance. Doug

