Search Results

Search found 104525 results on 4181 pages for 'code search engine'.


  • How would you react if someone told you your code is a mess?

    - by newbie
    I am a good programmer, or so I thought until now. I have always loved programming, and I want to learn as much as I can to become a better programmer. I studied programming for 1 year and have now been working as a programmer for almost 2 years, so in short I have almost 3 years of programming experience. Our team is composed of 5 programmers; 4 of us are new, and 1 has more than 3 years of experience. We have been working on a program for almost a year now, and nobody has ever reviewed my code; I was simply given a page to work with. We never had a code review, and since we are all new, none of us knows what clean code looks like. I assumed programmers learn by themselves? We deployed our program to production without thorough testing. Now things are tight, and we need approval and a code review before we can make changes to the code. For the first time, someone reviewed my code, and he said it is a mess. I feel so sad and hurt. I really love programming, and hearing something like that really hurts. I want to improve myself, but it seems I'm not a genius programmer like in the movies. Can you give me advice on how to get better? Have you ever had someone criticize your code and felt really hurt? What do you do in those situations? Thank you


  • Copy-and-Pasted Test Code: How Bad is This?

    - by joshin4colours
    My current job is mostly writing GUI test code for various applications that we work on. However, I find that I tend to copy and paste a lot of code within tests. The reason is that the areas I'm testing tend to be similar enough to invite repetition but not quite similar enough to encapsulate the code into methods or objects. I find that when I try to use classes or methods more extensively, tests become more cumbersome to maintain and sometimes outright difficult to write in the first place. Instead, I usually copy a big chunk of test code from one section, paste it into another, and make any minor changes I need. I don't use more structured ways of coding, such as OO principles or functions. Do other coders feel this way when writing test code? Obviously I want to follow the DRY and YAGNI principles, but I find that test code (automated test code for GUI testing, anyway) can make these principles tough to follow. Or do I just need more coding practice and a better overall system of doing things? EDIT: The tool I'm using is SilkTest, which uses a proprietary language called 4Test. These tests are mostly for Windows desktop applications, but I have also tested web apps with this setup.
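
    For illustration, when near-duplicate test blocks differ only in their data, one pattern is to push the differences into parameters. A minimal sketch, in Python rather than 4Test; the gui driver object and all names here are hypothetical:

        # Sketch only: 'gui' stands for a hypothetical GUI-automation driver, not a real API.
        def fill_and_submit_form(gui, form_name, field_values, expected_status):
            """Open a form, fill its fields, submit, and check the status message."""
            gui.open_form(form_name)
            for field, value in field_values.items():
                gui.set_field(field, value)
            gui.click("Submit")
            assert gui.status_message() == expected_status

        # Two tests that would otherwise be copy-pasted blocks differing only in data:
        def test_new_customer(gui):
            fill_and_submit_form(gui, "NewCustomer",
                                 {"Name": "Ada", "City": "London"}, "Customer saved")

        def test_new_order(gui):
            fill_and_submit_form(gui, "NewOrder",
                                 {"Item": "Widget", "Qty": "3"}, "Order saved")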


  • Can anyone point me to some open source DirectX rendering engines or frameworks? [on hold]

    - by Jim
    I'm completely new to graphics API programming, but not at all new to the theory and principles of operation of game engines and rendering engines. That being said, I want to run some experiments rendering very dense geometry scenes in a basic rendering engine or game engine. I don't need a lot of bells and whistles. What I need is enough control that I can implement my own scene graph algorithms and control the rendering pipeline very specifically. My ideal candidate would be a rendering engine or game engine with a modular design that is ready to go out of the box but simple enough that I can rip out some of the guts of the rendering management and implement my own. It's a tough call, because I'm right at the level where it's almost better to start from scratch, but there's no sense in having to build every single basic thing such as hierarchical transforms, etc. I just want to work on rendering optimization to push dense geometry for maximum FPS. Does anyone have a suggestion for an engine or basic framework to use? I requested DirectX in my title because I figured it would likely be better supported and less likely to hit some obscure, less-documented problem, but OpenGL would be acceptable if the recommended framework were definitely better than my other options. EDIT: I should add that I really want GPU tessellation support (as part of adding to the density of geometry detail).


  • Algorithms for "fuzzy matching" strings

    - by Alexey Romanov
    By fuzzy matching I don't mean similar strings by Levenshtein distance or something like that, but the way it's used in TextMate/Ido/Icicles: given a list of strings, find those which include all the characters of the search string, in order, but possibly with other characters in between, preferring the best fit.
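
    A minimal sketch of this kind of subsequence matching: every character of the pattern must appear in the candidate, in order, with anything allowed in between. The scoring below (rewarding consecutive hits) is just one simple heuristic, not the exact ranking any particular editor uses:

        def fuzzy_match(pattern, candidate):
            """Return a score (higher = tighter fit), or None if no match."""
            hay = candidate.lower()
            score, pos = 0, -1
            for ch in pattern.lower():
                nxt = hay.find(ch, pos + 1)
                if nxt == -1:
                    return None                  # a required character is missing
                # reward characters that immediately follow the previous hit
                score += 2 if pos >= 0 and nxt == pos + 1 else 1
                pos = nxt
            return score

        def fuzzy_filter(pattern, candidates):
            """Candidates containing pattern as a subsequence, best fit first."""
            scored = [(fuzzy_match(pattern, c), c) for c in candidates]
            return [c for s, c in sorted(((s, c) for s, c in scored if s is not None),
                                         key=lambda p: -p[0])]

        print(fuzzy_filter("bar", ["foobar", "bazaar", "burma", "grep"]))
        # -> ['foobar', 'bazaar']   ('bar' is contiguous in foobar, scattered in bazaar)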


  • wssql always returning zero rows

    - by Lavinski
    I'm using the Windows Search 4.0 service (wssql) to find some files. It works fine on my computer, but on our server, which has two drives, C: and D:, it always returns 0 rows when searching D:. Also, I'm not sure if it's related, but cd d: goes back to C: in the command prompt.
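
    For reference, a Windows Search query over a given scope can be reproduced from Python through ADO. A sketch, assuming the standard Search.CollatorDSO OLE DB provider and the pywin32 package; if D: is outside the service's indexing scope, a query like this also comes back empty:

        import win32com.client  # pywin32

        conn = win32com.client.Dispatch("ADODB.Connection")
        conn.Open("Provider=Search.CollatorDSO;"
                  "Extended Properties='Application=Windows';")

        rs = win32com.client.Dispatch("ADODB.Recordset")
        rs.Open("SELECT System.ItemPathDisplay FROM SystemIndex "
                "WHERE SCOPE='file:D:/'", conn)

        while not rs.EOF:                      # zero iterations if D: isn't indexed
            print(rs.Fields.Item("System.ItemPathDisplay").Value)
            rs.MoveNext()
        rs.Close()
        conn.Close()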


  • Searching subversion history (full text)

    - by rjmunro
    Is there a way to perform a full text search of a Subversion repository, including all the history? For example, I've written a feature that I used somewhere, but then it wasn't needed, so I svn rm'd the files, but now I need to find it again to use it for something else. The svn log probably says something like "removed unused stuff", and there are loads of checkins like that.
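
    One brute-force approach is to stream every revision's diff and scan it for the text. A sketch, assuming Subversion 1.7+ (where svn log --diff prints each revision's full diff); this can be very slow on a large repository:

        import subprocess

        def search_history(repo_url, needle):
            """Print every history line containing needle, tagged with its revision."""
            log = subprocess.Popen(["svn", "log", "--diff", repo_url],
                                   stdout=subprocess.PIPE, text=True, errors="replace")
            revision = None
            for line in log.stdout:
                if line.startswith("r") and " | " in line:
                    revision = line.split(" | ")[0]   # header line, e.g. "r1234 | ..."
                if needle in line:
                    print(revision, line.rstrip())

        # hypothetical URL and search string:
        search_history("http://svn.example.com/repo/trunk", "my_lost_feature")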


  • How to create managed properties at site collection level in SharePoint 2013

    - by ybbest
    In SharePoint 2013, you can create managed properties at the site collection level. Today, I'd like to show you how to do so through PowerShell.

    1. Define your managed properties, crawled properties, and managed property type in an external CSV file. The PowerShell script will read this file and create the managed properties and the mappings.

    2. As you can see, I also defined the variant type, because you need the variant type to create the crawled property. In order to have the crawled properties, you need to do a full crawl and also make sure you have data populated for your custom column. However, if you do not want to do a full crawl to create those crawled properties, you can create them yourself with PowerShell; you just need to make sure the crawled properties you create have the same names they would get from a full crawl.

    Managed property types:
        Text = 1
        Integer = 2
        Decimal = 3
        DateTime = 4
        YesNo = 5
        Binary = 6

    Variant types:
        Text = 31
        Integer = 20
        Decimal = 5
        DateTime = 64
        YesNo = 11

    3. You can use the following script to create your managed properties at the site collection level; the difference when creating a managed property at the site collection level is that you pass in the site collection id.

        param(
            [string] $siteUrl = "http://SP2013/",
            [string] $searchAppName = "Search Service Application",
            $ManagedPropertiesList = (IMPORT-CSV ".\ManagedProperties.csv")
        )

        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
        $searchapp = $null

        function AppendLog {
            param ([string] $msg, [string] $msgColor)
            $currentDateTime = Get-Date
            $msg = $msg + " --- " + $currentDateTime
            if (!($logOnly -eq $True)) {
                # write to console
                Write-Host -f $msgColor $msg
            }
            # write to log file
            Add-Content $logFilePath $msg
        }

        $scriptPath = Split-Path $myInvocation.MyCommand.Path
        $logFilePath = $scriptPath + "\CreateManagedProperties_Log.txt"

        function CreateRefiner {
            param ([string] $crawledName, [string] $managedPropertyName,
                   [Int32] $variantType, [Int32] $managedPropertyType,
                   [System.GUID] $siteID)

            $cat = Get-SPEnterpriseSearchMetadataCategory -Identity SharePoint -SearchApplication $searchapp
            $crawledproperty = Get-SPEnterpriseSearchMetadataCrawledProperty -Name $crawledName -SearchApplication $searchapp -SiteCollection $siteID
            if ($crawledproperty -eq $null)
            {
                AppendLog "Creating Crawled Property for $managedPropertyName" Yellow
                $crawledproperty = New-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $searchapp -VariantType $variantType -SiteCollection $siteID -Category $cat -PropSet "00130329-0000-0130-c000-000000131346" -Name $crawledName -IsNameEnum $false
            }

            $managedproperty = Get-SPEnterpriseSearchMetadataManagedProperty -Identity $managedPropertyName -SearchApplication $searchapp -SiteCollection $siteID -ErrorAction SilentlyContinue
            if ($managedproperty -eq $null)
            {
                AppendLog "Creating Managed Property for $managedPropertyName" Yellow
                $managedproperty = New-SPEnterpriseSearchMetadataManagedProperty -Name $managedPropertyName -Type $managedPropertyType -SiteCollection $siteID -SearchApplication $searchapp -Queryable:$true -Retrievable:$true -FullTextQueriable:$true -RemoveDuplicates:$false -RespectPriority:$true -IncludeInMd5:$true
            }

            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ?{ $_.Name -eq $managedProperty.Name }
            if ($mappedProperty -eq $null)
            {
                AppendLog "Creating Crawled -> Managed Property mapping for $managedPropertyName" Yellow
                New-SPEnterpriseSearchMetadataMapping -CrawledProperty $crawledproperty -ManagedProperty $managedproperty -SearchApplication $searchapp -SiteCollection $siteID
            }
            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ?{ $_.Name -eq $managedProperty.Name }
            #Get-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $managedproperty
        }

        $searchapp = Get-SPEnterpriseSearchServiceApplication $searchAppName
        $site = Get-SPSite $siteUrl
        $siteId = $site.id

        Write-Host "Start creating Managed properties"
        $i = 1
        FOREACH ($property in $ManagedPropertiesList)
        {
            $propertyName = $property.managedPropertyName
            $crawledName = $property.crawledName
            $managedPropertyType = $property.managedPropertyType
            $variantType = $property.variantType
            Write-Host $managedPropertyType
            Write-Host "Processing managed property $propertyName $($i)..."
            $i++
            CreateRefiner $crawledName $propertyName $variantType $managedPropertyType $siteId
            Write-Host "Managed property created " $propertyName
        }

    Key Concepts

    Crawled Properties: Crawled properties are discovered by the search index service component when crawling content.

    Managed Properties: Properties that are part of the Search user experience, which means they are available for search results, advanced search, and so on, are managed properties.

    Mapping Crawled Properties to Managed Properties: To make a crawled property available for the Search experience (available for search queries and displayed in Advanced Search and search results), you must map it to a managed property.

    References
    Administer search in SharePoint 2013 Preview
    Managing Metadata
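
    For reference, the script reads four columns from the CSV. A hypothetical ManagedProperties.csv (the property names here are made up; the type codes come from the tables above) might look like:

        crawledName,managedPropertyName,managedPropertyType,variantType
        ows_Department,DepartmentManaged,1,31
        ows_HireDate,HireDateManaged,4,64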


  • Filtering records in app-engine (Java)

    - by Manjoor
    I have the following code running perfectly. It filters records based on a single parameter.

        public List<Orders> GetOrders(String email) {
            PersistenceManager pm = PMF.get().getPersistenceManager();
            Query query = pm.newQuery(Orders.class);
            query.setFilter("Email == pEmail");
            query.setOrdering("Id desc");
            query.declareParameters("String pEmail");
            query.setRange(0, 50);
            return (List<Orders>) query.execute(email);
        }

    Now I want to filter on multiple parameters. sdate and edate are the start date and end date. In the datastore they are saved as Date (not String).

        public List<Orders> GetOrders(String email, String icode, String sdate, String edate) {
            PersistenceManager pm = PMF.get().getPersistenceManager();
            Query query = pm.newQuery(Orders.class);
            query.setFilter("Email == pEmail");
            query.setFilter("ItemCode == pItemCode");
            query.declareParameters("String pEmail");
            query.declareParameters("String pItemCode");
            // ... set filter and declare the other 2 parameters
            query.setRange(0, 50);
            query.setOrdering("Id desc");
            return (List<Orders>) query.execute(email, icode, sdate, edate);
        }

    Any clue?


  • App-Engine Parse a UrlFetch UTF-8 encoded stream

    - by Davidrd91
    I am trying to parse an XML document from a URL using the xml.sax parser. I know there are other libraries I could use, but coming from Java this is the one I am most familiar with, and it seems the least complicated to me. The code I'm using to parse is as follows:

        parser = xml.sax.make_parser()
        handler = MangaHandler()
        parser.setContentHandler(handler)
        url = urlfetch.Fetch('http://www.mangapanda.com/alphabetical',
                             allow_truncated=False,
                             follow_redirects=False,
                             deadline=False)
        xml.sax.parseString(url.content, handler)

    This raises a SAXException (invalid token) once the parser reaches the first & sign:

        SAXParseException: <unknown>:582:34: not well-formed (invalid token)

    Because urlfetch returns a string and not a stream, I cannot use parse() (which only works with streams) and am left to use parseString() instead. To see if parsing as a stream would fix this, I tried:

        parser.parse(io.StringIO(url.content).encode('utf-8'))

    but this returns:

        TypeError: initial_value must be unicode or None, not str

    I have also tried the urllib2 library, which does return a stream instead of a string, but the file is too large and is automatically truncated, leaving me with missing data. Any sort of workaround would be greatly appreciated, as I've spent days getting around one obstacle just to be stopped by another.
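
    A side note, offered as a sketch: a byte stream over the fetched content can be had with io.BytesIO, which xml.sax's parse() accepts (io.StringIO insists on unicode, hence the TypeError). That said, an invalid token at an & sign usually means the page is HTML with unescaped ampersands rather than well-formed XML, and switching to a stream will not change that.

        import io
        import xml.sax

        # url.content is a byte string; BytesIO wraps it as a binary stream.
        parser = xml.sax.make_parser()
        parser.setContentHandler(handler)     # handler = MangaHandler() as above
        parser.parse(io.BytesIO(url.content))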


  • SQL Server Full-Text Search: Hung processes with MSSEARCH wait type

    - by CheeseInPosition
    We have a SQL Server 2005 SP2 machine running a large number of databases, all of which contain full-text catalogs. Whenever we try to drop one of these databases or rebuild a full-text index, the drop or rebuild process hangs indefinitely with a MSSEARCH wait type. The process can't be killed, and a server reboot is required to get things running again. Based on a Microsoft forums post [1], it appears that the problem might be an improperly removed full-text catalog. Can anyone recommend a way to determine which catalog is causing the problem, without having to remove all of them? [1] http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2681739&SiteID=1 : "Yes, we did have full-text catalogues in the database, but since I had disabled full-text search for the database, and disabled msftesql, I didn't suspect them. However, I got an article from Microsoft support showing me how I could test for catalogues not properly removed. So I discovered that there still existed an old catalogue, which I, after and only after re-enabling full-text search, was able to delete; since then my backup has worked."


  • Unable to get simple Ruby on Rails search to work :/

    - by edu222
    I am new to RoR; any help would be greatly appreciated :) I have a basic scaffolded CRUD app to add customers. I am trying to search by the first_name or last_name fields. The error that I am getting is:

        NoMethodError in Clientes#find
        You have a nil object when you didn't expect it!
        You might have expected an instance of Array.
        The error occurred while evaluating nil.each

        Extracted source (around line #9):
        6: <th>Apellido</th>
        7: </tr>
        8:
        9: <% for cliente in @clientes %>
        10: <tr>
        11: <td><%=h cliente.client_name %></td>
        12: <td><%=h cliente.client_lastname %></td>

        Application Trace
        C:/Rails/clientes/app/views/clientes/find.html.erb:9:in `_run_erb_app47views47clientes47find46html46erb'

    My find action in controllers/clientes_controller.rb is:

        # Find
        def find
          @cliente = Cliente.find(:all,
            :conditions => ["client_name = ? OR client_lastname = ?",
                            params[:search_string], params[:search_string]])
        end

    My views/layouts clientes.html.erb form code fragment is:

        <span style="text-align: right">
          <% form_tag "/clientes/find" do %>
            <%= text_field_tag :search_string %>
            <%= submit_tag "Search" %>
          <% end %>
        </span>

    The search template I created in views/clientes/find.html.erb:

        <h1>Listing clientes for <%= params[:search_string] %></h1>
        <table>
          <tr>
            <th>Nombre</th>
            <th>Apellido</th>
          </tr>
          <% for cliente in @clientes %>
            <tr>
              <td><%=h cliente.client_name %></td>
              <td><%=h cliente.client_lastname %></td>
              <td><%= link_to 'Mostrar', cliente %></td>
              <td><%= link_to 'Editar', edit_cliente_path(cliente) %></td>
              <td><%= link_to 'Eliminar', cliente,
                      :confirm => 'Estas Seguro de que desear eliminar a este cliente?',
                      :method => :delete %></td>
            </tr>
          <% end %>
        </table>
        <%= link_to 'Atras', clientes_path %>


  • MySQL full text search with partial words

    - by Rob
    MySQL full-text searching appears to be great and the best way to search in SQL. However, I seem to be stuck on the fact that it won't search partial words. For instance, if I have an article titled "MySQL Tutorial" and search for "MySQL", it won't find it. Having done some searching, I found various references to support for this coming in MySQL 4 (I'm using 5.1.40). I've tried using "MySQL" and "%MySQL%", but neither works (one link I found suggested using asterisks, but only at the end or the beginning, not both). Here's my table structure and my query; if someone could tell me where I'm going wrong, that would be great. I'm assuming partial word matching is built in somehow.

        CREATE TABLE IF NOT EXISTS `articles` (
          `article_id` smallint(5) unsigned NOT NULL AUTO_INCREMENT,
          `article_name` varchar(64) NOT NULL,
          `article_desc` text NOT NULL,
          `article_link` varchar(128) NOT NULL,
          `article_hits` int(11) NOT NULL,
          `article_user_hits` int(7) unsigned NOT NULL DEFAULT '0',
          `article_guest_hits` int(10) unsigned NOT NULL DEFAULT '0',
          `article_rating` decimal(4,2) NOT NULL DEFAULT '0.00',
          `article_site_id` smallint(5) unsigned NOT NULL DEFAULT '0',
          `article_time_added` int(10) unsigned NOT NULL,
          `article_discussion_id` smallint(5) unsigned NOT NULL DEFAULT '0',
          `article_source_type` varchar(12) NOT NULL,
          `article_source_value` varchar(12) NOT NULL,
          PRIMARY KEY (`article_id`),
          FULLTEXT KEY `article_name` (`article_name`,`article_desc`,`article_link`)
        ) ENGINE=MyISAM DEFAULT CHARSET=utf8 AUTO_INCREMENT=7;

        INSERT INTO `articles` VALUES
        (1, 'MySQL Tutorial', 'Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry''s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.', 'http://www.domain.com/', 6, 3, 1, '1.50', 1, 1269702050, 1, '0', '0'),
        (2, 'How To Use MySQL Well', 'Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry''s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.', 'http://www.domain.com/', 1, 2, 0, '3.00', 1, 1269702050, 1, '0', '0'),
        (3, 'Optimizing MySQL', 'Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry''s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.', 'http://www.domain.com/', 0, 1, 0, '3.00', 1, 1269702050, 1, '0', '0'),
        (4, '1001 MySQL Tricks', 'Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry''s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.', 'http://www.domain.com/', 0, 1, 0, '3.00', 1, 1269702050, 1, '0', '0'),
        (5, 'MySQL vs. YourSQL', 'Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry''s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.', 'http://www.domain.com/', 0, 2, 0, '3.00', 1, 1269702050, 1, '0', '0'),
        (6, 'MySQL Security', 'Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry''s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.', 'http://www.domain.com/', 0, 2, 0, '3.00', 1, 1269702050, 1, '0', '0');

        SELECT count(a.article_id)
        FROM articles a
        WHERE MATCH (a.article_name, a.article_desc, a.article_link) AGAINST ('mysql')
        GROUP BY a.article_id
        ORDER BY a.article_time_added ASC

    The a prefix is used because the query comes from a function that sometimes adds additional joins. As you can see, a search for MySQL should return a count of 6, but unfortunately it doesn't.


  • Best way to retrieve a certain field of all documents returned by a Lucene search

    - by Philipp
    Hi, I was wondering what the best way is to retrieve a certain field of all documents returned by a Lucene Searcher. Background: each document has a date field (written on), and I would like to show a timeline of all found documents, so I need to extract the date (day) field of every document the search finds. I currently retrieve each document using Searcher.doc(int, FieldSelector), with the selector retrieving only that field. I have indexed 250k documents; the search itself takes no time and returns about 10k document ids. Retrieving those, however, takes 20+ seconds. What can I do to speed things up but still get all the values I need? Thanks in advance, Philipp


  • Search results filter effects - Flex

    - by Adam
    I've created a search with a couple of comboboxes that allow users to filter their search results. The results currently use a TileList and an itemRenderer to display, and now I'd like to add an animation effect when the user filters the results. I know that you can use itemsChangeEffect to create an animation effect when the user drags and moves result items. So I'd like to know if there's a way to create a similar effect triggered by the filtering on the comboboxes? Thanks.


  • Optimizing encrypted column search

    - by Sung Meister
    I have a table called tblClient with an encrypted column called SSN. Due to company policy, we encrypted SSN using a symmetric key (chosen over an asymmetric key for performance reasons) protected by a password. Here is a partial LIKE search on SSN:

        declare @SSN varchar(11)
        set @SSN = '111-22-%'

        open symmetric key SSN_KEY decryption by password = 'secret'

        select Client_ID
        from tblClient (nolock)
        where convert(nvarchar(11), DECRYPTBYKEY(SSN)) like @SSN

        close symmetric key SSN_KEY

    Before encryption, searching through 150,000 records took less than 1 second, but with decryption in the mix, the same search takes around 5 seconds. What strategy can I apply to optimize searching through the encrypted column?


  • Lucene search taking TOO long.

    - by Josh Handel
    I'm using Lucene.net (2.9.2.2) on a (currently) 70 GB index. I can do a fairly complicated search and get all the document IDs back in 1-2 seconds, but actually loading up all the hits (about 700 thousand in my test queries) takes 5+ minutes. We aren't using Lucene for a UI; this is a datastore between processes where we have hundreds of millions of pre-cached data elements, and the part I am working on exports a few specific fields from each found document. (Ergo, pagination doesn't make sense, as this is an export between processes.) My question is: what is the best way to get all of the documents in a search result? Currently I am using a custom collector that does a get on the document (with a MapFieldSelector) as it collects. I've also tried iterating through the list after the collector has finished, but that was even worse. I'm open to ideas :-). Thanks in advance.


  • Eclipse Search Only Specific Folders

    - by Craig
    Hello, I already saw the answers for a question almost identical to this: http://stackoverflow.com/questions/443169/eclipse-exclude-folders-from-search. However, I am looking for a solution that would let me look at, say, 10 of my 200 folders, where those 10 change all the time. Is there a way I can search through just one folder and avoid any folders that are not inside it? I don't want to create a separate project, as I am using SVN, and I have had cases with mxml and other files where moving a file from one project to another caused problems for other developers.


  • Search select statement

    - by Nana
    I am creating a page which has different fields for the user to search from, e.g. search by:

        Grade: -dropdownlist1-
        Student name: -dropdownlist2-
        Student ID: -dropdownlist3-
        Lessons: -dropdownlist4-
        Year: -dropdownlist5-

    How do I write the select statement for this? Each dropdownlist would need its own select statement to extract different data from the database, but I want to write ONE select statement which can dynamically handle the dropdownlist options, instead of writing many, many select statements. Let's say:

        Grade: -dropdownlist1-; default value (all)
        Student name: -dropdownlist2-; default value (all)
        Student ID: -dropdownlist3-; 0-100 is chosen
        Lessons: -dropdownlist4-; A-C is chosen
        Year: -dropdownlist5-; 2009 is chosen
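
    One common way to get a single statement is to build the WHERE clause dynamically, skipping any dropdown left at its "(all)" default. A sketch in Python (the students table and its column names are made up) that produces a parameterized query for whatever database driver is in use:

        def build_search_query(filters):
            """filters maps column name -> selected value; None/'all' means no filter.
            Column names must come from a fixed whitelist, never from user input.
            Range filters (e.g. 0-100) would add BETWEEN clauses the same way."""
            clauses, params = [], []
            for column, value in filters.items():
                if value not in (None, "", "all"):
                    clauses.append(column + " = ?")   # '?' placeholder; some drivers use '%s'
                    params.append(value)
            sql = "SELECT * FROM students"
            if clauses:
                sql += " WHERE " + " AND ".join(clauses)
            return sql, params

        sql, params = build_search_query(
            {"grade": "all", "student_name": "all",
             "student_id": "42", "lesson": "A", "year": "2009"})
        # -> SELECT * FROM students WHERE student_id = ? AND lesson = ? AND year = ?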


  • PostgreSQL XML text search

    - by cro
    I have a text column in a table in which we store XML. Now I want to search for tags and values, e.g. with example data like <name>Citi Bank</name>. I would like to run the following query:

        select * from xxxx
        where to_tsvector('english', xml_column) @@ to_tsquery('Citi Bank')

    This works fine, but it also matches when the text appears in tags like name1, or in no tag at all. How do I have to set up my search for this to work so that I get an exact match on the tag and the value?


  • appcfg.py upload_data entity kind problem

    - by Dingo
    Hi, I am developing an application on app-engine-patch and I would like to upload some data to the datastore. For example, I have a model models/places.py:

        class Place(db.Model):
            name = db.StringProperty()
            longitude = db.FloatProperty()
            latitude = db.FloatProperty()

    If I save this in a view, kind() of the entity is "models_place". All is OK; Place.all() in a view works fine. But if I upload rows using appcfg.py upload_data, the kind() of those entities is "Place". loader.py looks like this:

        import datetime, os, sys
        from google.appengine.ext import db
        from google.appengine.tools import bulkloader

        libs_path = os.path.join("/home/martin/myproject/src/")
        if libs_path not in sys.path:
            sys.path.insert(0, libs_path)

        from models import places

        class AlbumLoader(bulkloader.Loader):
            def __init__(self):
                bulkloader.Loader.__init__(self, 'Place',
                    [('name', lambda x: x.decode('utf-8')),
                     ('longitude', float),
                     ('latitude', float),
                    ])

        loaders = [AlbumLoader]

    and the command for uploading:

        python /usr/local/google_appengine/appcfg.py upload_data \
            --config_file=places_loader.py --kind=models_place \
            --filename=data/places.csv --url=http://localhost:8000/remote_api \
            /home/martin/myproject/src/
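
    A guess, offered as a sketch: since app-engine-patch stores this model under the kind name "models_place", the loader presumably has to register under that same kind string instead of 'Place', matching the --kind flag on the command line (imports as in loader.py above):

        class AlbumLoader(bulkloader.Loader):
            def __init__(self):
                # Register under the kind name the app actually writes
                # ("models_place"), not the plain class name "Place".
                bulkloader.Loader.__init__(self, 'models_place',
                    [('name', lambda x: x.decode('utf-8')),
                     ('longitude', float),
                     ('latitude', float),
                    ])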


  • SQL Server Full Text Search Leading Wildcard

    - by aherrick
    After taking a look at this SO question and doing my own research, it appears that you cannot have a leading wildcard with full-text search. So in the simplest example, if I have a table TABLE1 with one column, COLUMN1, whose values are:

        coin
        coinage
        undercoin

    then

        select COLUMN1 from TABLE1 where COLUMN1 LIKE '%coin%'

    would get me the results I want. How can I get the exact same results with full-text search enabled on the column? The following two queries return the exact same data, which is not exactly what I want:

        SELECT COLUMN1 FROM TABLE1 WHERE CONTAINS(COLUMN1, '"coin*"')
        SELECT COLUMN1 FROM TABLE1 WHERE CONTAINS(COLUMN1, '"*coin*"')

