Search Results

Search found 38640 results on 1546 pages for 'full table scan'.

Page 399/1546 | < Previous Page | 395 396 397 398 399 400 401 402 403 404 405 406  | Next Page >

  • GetOleDbSchemaTable Foreign Keys on Sql Server 2005

    - by haxelit
    I'm trying to get the Foreign Keys for a table in my SQL Server 2005 database. I'm using the GetOleDbSchemaTable function right now: DataTable schemaTable = connection.GetOleDbSchemaTable( OleDbSchemaGuid.Foreign_Keys, new object[] { null, null, null, "TABLE" }); This pulls back the right foreign keys; the only problem is that the UpdateRule and DeleteRule are set to "No Action". If I browse to the same table in SSMS I can see that my DeleteRule is "Set NULL". Does the GetOleDbSchemaTable function not return the proper foreign key rules? Has anyone else run into this problem?
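
    For reference, a quick way to cross-check what the server itself stores for those rules, independent of the OLE DB schema rowset, is to query the SQL Server 2005 catalog views. A sketch ('dbo.MyTable' is a placeholder for the actual table):

      -- Reports the update/delete rules SQL Server has recorded for each foreign key on the table.
      SELECT fk.name                              AS fk_name,
             OBJECT_NAME(fk.parent_object_id)     AS referencing_table,
             OBJECT_NAME(fk.referenced_object_id) AS referenced_table,
             fk.update_referential_action_desc    AS update_rule,
             fk.delete_referential_action_desc    AS delete_rule
      FROM sys.foreign_keys AS fk
      WHERE fk.parent_object_id = OBJECT_ID('dbo.MyTable');

    If this shows SET_NULL while GetOleDbSchemaTable still reports "No Action", the discrepancy is on the provider side rather than in the database.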

    Read the article

  • Optimizing a Soundex Query for finding similar names

    - by xkingpin
    My application will offer a list of suggestions for English names that "sound like" a given typed name. The query will need to be optimized and return results as quickly as possible. Which option would be best for returning results quickly? (Or your own suggestion if you have one.) A. Generate the Soundex hash and store it in the "Names" table, then do something like the following (this saves generating the Soundex hash for every row in my db per query, right?): select name from names where NameSoundex = Soundex('Ann') B. Use the Difference function (this must generate the Soundex for every name in the table?): select name from names where Difference(name, 'Ann') = 3 C. Simple comparison: select name from names where Soundex(name) = Soundex('Ann') Option A seems to me like it would be the fastest to return results because it only generates the Soundex for one string and then compares to an indexed column "NameSoundex". Option B should give more results than option A because the name does not have to be an exact match of the Soundex, but it could be slower. Assuming my table could contain millions of rows, what would yield the best results?
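
    For context, option A is usually implemented as an indexed computed column, so the stored hash can never drift out of sync with the name. A sketch, assuming SQL Server (implied by the DIFFERENCE function) and the names/NameSoundex columns from the question:

      -- Persist the hash next to the name and index it; SOUNDEX is deterministic, so this is allowed.
      ALTER TABLE names ADD NameSoundex AS SOUNDEX(name) PERSISTED;
      CREATE INDEX IX_names_NameSoundex ON names (NameSoundex);

      -- The lookup computes SOUNDEX once for the search term and can seek the index.
      SELECT name
      FROM names
      WHERE NameSoundex = SOUNDEX('Ann');

    Options B and C both have to evaluate a function against every row, which forces a full table scan once the table reaches millions of rows.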

    Read the article

  • PrincipalPermission - roles separate from permissions

    - by Leblanc Meneses
    I've been using PrincipalPermission for a while in WCF services: [PrincipalPermission(SecurityAction.Demand, Role = SecurityRoles.CanManageUsers)] although now I have a requirement to simplify roles by business unit - currently aspnet_roles has fine-grained can* permissions. Here is my approach; I wanted to see if anyone can provide feedback or a code review before I implement it. 1) aspnet_roles - business unit role 2) create a Permission table plus Role_Permission and User_Permission tables (many to many) 3) create a custom CodeAccessSecurityAttribute that looks at the new tables: [CustomPermissionCheck(Security.Demand, HasPermission="can*")] For the first iteration I'll statically new up the dependent repository; ideally I would like an AOP-style attribute that has the repository injected: IPermissionRepository.HasPermission(...). If I go the AOP way I will probably stop inheriting from CodeAccessSecurityAttribute -- what do the security guys have to say about this? Has anyone else solved this? Is there something in the framework that I've missed?
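
    A minimal sketch of the tables in step 2, assuming SQL Server and that the existing aspnet_Users/aspnet_Roles tables supply the user and role ids (all names here are illustrative, not taken from the post):

      CREATE TABLE Permission (
          PermissionId INT IDENTITY(1,1) PRIMARY KEY,
          Name         NVARCHAR(100) NOT NULL UNIQUE   -- e.g. 'CanManageUsers'
      );

      CREATE TABLE Role_Permission (
          RoleId       UNIQUEIDENTIFIER NOT NULL,      -- aspnet_Roles.RoleId
          PermissionId INT NOT NULL REFERENCES Permission (PermissionId),
          PRIMARY KEY (RoleId, PermissionId)
      );

      CREATE TABLE User_Permission (
          UserId       UNIQUEIDENTIFIER NOT NULL,      -- aspnet_Users.UserId
          PermissionId INT NOT NULL REFERENCES Permission (PermissionId),
          PRIMARY KEY (UserId, PermissionId)
      );

    HasPermission(...) then reduces to an EXISTS check over the union of the role-based and user-based grants.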

    Read the article

  • Custom validation works in development but not in unit test

    - by Geolev
    I want to validate that at least one of two columns have a value in my model. I found somewhere on the web that I could create a custom validator as follows: # Check for the presence of one or another field: # :validates_presence_of_at_least_one_field :last_name, :company_name - would require either last_name or company_name to be filled in # also works with arrays # :validates_presence_of_at_least_one_field :email, [:name, :address, :city, :state] - would require email or a mailing type address module ActiveRecord module Validations module ClassMethods def validates_presence_of_at_least_one_field(*attr_names) msg = attr_names.collect {|a| a.is_a?(Array) ? " ( #{a.join(", ")} ) " : a.to_s}.join(", ") + "can't all be blank. At least one field must be filled in." configuration = { :on => :save, :message => msg } configuration.update(attr_names.extract_options!) send(validation_method(configuration[:on]), configuration) do |record| found = false attr_names.each do |a| a = [a] unless a.is_a?(Array) found = true a.each do |attr| value = record.respond_to?(attr.to_s) ? record.send(attr.to_s) : record[attr.to_s] found = !value.blank? end break if found end record.errors.add_to_base(configuration[:message]) unless found end end end end end I put this in a file called lib/acs_validator.rb in my project and added "require 'acs_validator'" to my environment.rb. This does exactly what I want. It works perfectly when I manually test it in the development environment but when I write a unit test it breaks my test environment. This is my unit test: require 'test_helper' class CustomerTest < ActiveSupport::TestCase # Replace this with your real tests. test "the truth" do assert true end test "customer not valid" do puts "customer not valid" customer = Customer.new assert !customer.valid? assert customer.errors.invalid?(:subdomain) assert_equal "Company Name and Last Name can't both be blank.", customer.errors.on(:contact_lname) end end This is my model: class Customer < ActiveRecord::Base validates_presence_of :subdomain validates_presence_of_at_least_one_field :customer_company_name, :contact_lname, :message => "Company Name and Last Name can't both be blank." has_one :service_plan end When I run the unit test, I get the following error: DEPRECATION WARNING: Rake tasks in vendor/plugins/admin_data/tasks, vendor/plugins/admin_data/tasks, and vendor/plugins/admin_data/tasks are deprecated. Use lib/tasks instead. (called from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/tasks/rails.rb:10) Couldn't drop acs_test : #<ActiveRecord::StatementInvalid: PGError: ERROR: database "acs_test" is being accessed by other users DETAIL: There are 1 other session(s) using the database. 
: DROP DATABASE IF EXISTS "acs_test"> acs_test already exists NOTICE: CREATE TABLE will create implicit sequence "customers_id_seq" for serial column "customers.id" NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "customers_pkey" for table "customers" NOTICE: CREATE TABLE will create implicit sequence "service_plans_id_seq" for serial column "service_plans.id" NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "service_plans_pkey" for table "service_plans" /usr/bin/ruby1.8 -I"lib:test" "/usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb" "test/unit/customer_test.rb" "test/unit/service_plan_test.rb" "test/unit/helpers/dashboard_helper_test.rb" "test/unit/helpers/customers_helper_test.rb" "test/unit/helpers/service_plans_helper_test.rb" /usr/lib/ruby/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:1994:in `method_missing_without_paginate': undefined method `validates_presence_of_at_least_one_field' for #<Class:0xb7076bd0> (NoMethodError) from /usr/lib/ruby/gems/1.8/gems/will_paginate-2.3.12/lib/will_paginate/finder.rb:170:in `method_missing' from /home/george/projects/advancedcomfortcs/app/models/customer.rb:3 from /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require' from /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:31:in `require' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:158:in `require' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:265:in `require_or_load' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:224:in `depend_on' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:136:in `require_dependency' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:414:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:413:in `each' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:413:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:411:in `each' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:411:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:197:in `process' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:113:in `send' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:113:in `run' from /home/george/projects/advancedcomfortcs/config/environment.rb:9 from ./test/test_helper.rb:2:in `require' from ./test/test_helper.rb:2 from ./test/unit/customer_test.rb:1:in `require' from ./test/unit/customer_test.rb:1 from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `load' from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `each' from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 rake aborted! Command failed with status (1): [/usr/bin/ruby1.8 -I"lib:test" "/usr/lib/ru...] (See full trace by running task with --trace) It seems to have stepped on will_paginate somehow. Does anyone have any suggestions? Is there another way to do the validation I'm attempting to do? Thanks, George

    Read the article

  • CREATE MySQL database with default InnoDB tables?

    - by memilanuk
    Hello, I've been working on writing a SQL statement to create a MySQL database with several default options, including the default character set and default collate. Is it possible to add syntax so that the default engine for tables in this database is InnoDB? I've been looking through the MySQL manual for v.5.1 and I've found the 'ENGINE=innodb' clause, which can be appended to a CREATE TABLE statement... but I haven't found anything related to a CREATE DATABASE statement. Is there a normal way to do this as part of the database creation, or does it need to be specified on a table-by-table basis? Thanks, Monte
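
    As far as I know there is no per-database default-engine clause on CREATE DATABASE; the default lives at the server or session level. A sketch, assuming MySQL 5.1 (where the session variable is named storage_engine; 5.5 and later call it default_storage_engine):

      -- Server-wide default, set in my.cnf / my.ini under [mysqld]:
      --   default-storage-engine = InnoDB

      CREATE DATABASE mydb
          DEFAULT CHARACTER SET utf8
          DEFAULT COLLATE utf8_general_ci;

      -- Session-level default for subsequently created tables:
      SET storage_engine = InnoDB;

      -- Explicit per-table override always works regardless of the default:
      CREATE TABLE mydb.example (id INT PRIMARY KEY) ENGINE = InnoDB;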

    Read the article

  • How to retain headers on all pages of an exported PDF in PHP?

    - by udaya
    Hi, I am exporting data from a PHP page to PDF. When the data exceeds the page limit, the header is not repeated on the subsequent pages. The function where I call the export to PDF is: function changeDetails() { $bType = $this->input->post('textvalue'); if($bType == "pdf") { $this->load->library('table'); $this->load->plugin('to_pdf'); $data['countrytoword'] = $this->AddEditmodel1->export(); $this->table->set_heading('Country','State','Town','Name'); $out = $this->table->generate($data['countrytoword']); $html = $this->load->view( 'newpdf',$data, true); pdf_create($html, $cur_date); } } This is my view page, from which I export data to PDF: Name Country State Town. The result I am getting is - page 1: Name country State Town, udaya india Tamilnadu kovai, chandru srilanka columbo aaaaa; page 2: vivek england gggkj gjgjkj. On page 2 I don't get the headers Name, Country, State and Town.

    Read the article

  • Creating database desktop application with data manipulation in Netbeans using Java Persistence

    - by Lulu
    It's my first time using Persistence in developing a Java program, because I usually connect via JDBC. I read that for large amounts of data, it is best to use persistence. I tried playing with the CRUD example in NetBeans. It's not very helpful, though, because it only connects to the DB and allows addition and deletion of records. I need something that will allow me to manipulate the data: for example, if the value from column C1 of table T1 is such-and-such, retrieve data from table T2. In short, I need to apply conditions before knowing what to retrieve exactly. The CRUD example already has a specific table to retrieve and only acts like a database manager. How is it possible to retrieve a specific item first and, based on it, determine the next steps to take? I'm also using embedded JavaDB/Derby as my database (also my first time using it, because I usually use remote MySQL).

    Read the article

  • How to implement filter system in SQL?

    - by sadvaw
    Right now I am planning to add a filter system to my site. Examples: (ID=apple, COLOR=red, TASTE=sweet, ORIGIN=US) (ID=mango, COLOR=yellow, TASTE=sweet, ORIGIN=MEXICO) (ID=banana, COLOR=yellow, TASTE=bitter-sweet, ORIGIN=US) So now I am interested in doing the following: SELECT ID FROM thisTable WHERE COLOR='yellow' AND TASTE='SWEET' But my problem is that I am doing this for multiple categories on my site, and the columns are NOT consistent (for example, if the table is for handphones, it will be BRAND, 3G-ENABLED, PRICE, COLOR, WAVELENGTH, etc.). How could I design a general schema that allows this? Right now I am planning on: table(ID, KEY, VALUE) This allows an arbitrary number of columns, but for the query I am using SELECT ID FROM table WHERE (KEY=X1 AND VALUE=V1) AND (KEY=X2 AND VALUE=V2), .. which returns an empty set. Can someone recommend a good solution to this? Note that the number of columns WILL change regularly.
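
    For what it's worth, the empty result is expected: a single (ID, KEY, VALUE) row only ever holds one key, so the two conditions can never both be true on the same row. The usual workaround is to match each pair on its own and then require that all pairs were found for the same ID. A sketch against the layout above ('filters' stands in for the real table name; KEY is a reserved word in MySQL, hence the backticks):

      SELECT ID
      FROM filters
      WHERE (`KEY` = 'COLOR' AND `VALUE` = 'yellow')
         OR (`KEY` = 'TASTE' AND `VALUE` = 'sweet')
      GROUP BY ID
      HAVING COUNT(DISTINCT `KEY`) = 2;   -- one matching row per required attribute

    An index on (KEY, VALUE, ID) keeps this from degenerating into a full table scan as the data grows.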

    Read the article

  • Best MySQL field setup for saving "couples" member profile data

    - by acctman
    For a couple's database profile entry, which would be the best way to save the data? The data will be retrieved via PHP, and if it could be done with one query that would be ideal. The options: 1) Within the site_member table, create a duplicate field for each field... ex: m_firstname1, m_firstname2, m_age1, m_age2, etc. 2) Store the couple's member data in one field each and separate the values with a comma in the data field... ex: m_firstname (Mike, Sherry). 3) Create a separate table site_member_c duplicating the same fields that are in the site_member table. This is roughly about 10 fields.
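
    For what it's worth, a rough sketch of option 3 and the single-query retrieval it allows; the id and column names are guesses, since the real site_member layout is not shown:

      -- Second person's data in its own table, keyed by the owning member's id.
      CREATE TABLE site_member_c (
          m_id        INT NOT NULL PRIMARY KEY,   -- same id as the row in site_member
          m_firstname VARCHAR(50) NOT NULL,
          m_age       INT
          -- ... the remaining ~10 profile fields
      );

      -- Both people of a couple in one query:
      SELECT a.m_firstname, a.m_age,
             b.m_firstname AS partner_firstname, b.m_age AS partner_age
      FROM site_member a
      LEFT JOIN site_member_c b ON b.m_id = a.m_id;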

    Read the article

  • Computed properties in NHibernate

    - by Liron Levi
    I'm having a problem mapping an existing data class member to the database. I have a database table where one of the columns is a string type but actually stores a comma-separated list of numbers. My data class exposes this field as a list of integers. The problem is that I couldn't find any hook in NHibernate that allows me to invoke the custom code required to convert the string field to the list and vice versa. To illustrate (simplified of course): The database table: CREATE TABLE dummy ( id serial, numlist text -- (can store values such as '1,2,3') ) The data class: class Dummy { public int Id; public List<int> NumbersList; } Can anyone help?

    Read the article

  • Help with writing a query for this requirement

    - by Lu Lu
    I need to write a SQL Server query but I don't know how to solve this. I have a table RealtimeData with data:

      Time                  | Value
      ----------------------|----------
      4/29/2009 12:00:00 AM | 3672.0000
      4/29/2009 12:01:00 AM | 3645.0000
      4/29/2009 12:02:00 AM | 3677.0000
      4/29/2009 12:03:00 AM | 3634.0000
      4/29/2009 12:04:00 AM | 3676.0000
      4/30/2009 12:00:00 AM | 3671.0000
      4/30/2009 12:01:00 AM | 3643.0000
      4/30/2009 12:02:00 AM | 3672.0000
      4/30/2009 12:03:00 AM | 3634.0000
      4/30/2009 12:04:00 AM | 3632.0000
      4/30/2009 12:05:00 AM | 3672.0000
      5/1/2009 12:00:00 AM  | 3673.0000
      5/1/2009 12:01:00 AM  | 3642.0000
      5/1/2009 12:02:00 AM  | 3672.0000
      5/1/2009 12:03:00 AM  | 3634.0000
      5/1/2009 12:04:00 AM  | 3635.0000

    I want to get the EOD (end of day) value for each day that exists in the table. With my sample data, the query should return a table like the following:

      Time      | Value
      ----------|----------
      4/29/2009 | 3676.0000
      4/30/2009 | 3672.0000
      5/1/2009  | 3635.0000

    Please help me solve my problem. Thanks.
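
    For reference, one way to express "last reading of each day" in SQL Server 2005 is to group on the timestamp truncated to its day and join back for the matching rows. A sketch using the Time/Value columns above:

      SELECT DATEADD(day, DATEDIFF(day, 0, r.[Time]), 0) AS [Day],
             r.[Value]
      FROM RealtimeData AS r
      JOIN (
          SELECT MAX([Time]) AS LastTime
          FROM RealtimeData
          GROUP BY DATEADD(day, DATEDIFF(day, 0, [Time]), 0)   -- timestamp truncated to midnight
      ) AS eod ON eod.LastTime = r.[Time];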

    Read the article

  • Please help me create an insert query (foreign key constraint error)

    - by Rajesh Rolen- DotNet Developer
    I want to move data from one database's table to another database's table, but it is giving me a foreign key error. Please tell me how I can insert all the data that is valid while skipping the rows that violate the foreign key. I am using SQL Server 2005. My query is: SET IDENTITY_INSERT City ON INSERT INTO City ([cityid],[city],[country],[state],[cityinfo] ,[enabled],[countryid],[citycode],[stateid],[latitude],[longitude]) SELECT [cityid],[city],[country],[state],[cityinfo] ,[enabled],[countryid],[citycode],[stateid],[latitude],[longitude] FROM TD.DBo.City I am getting this error: The INSERT statement conflicted with the FOREIGN KEY constraint "FK__city__countryid__3E52440B". The conflict occurred in database "schoolHigher", table "dbo.country", column 'countryId'. Please tell me how I can move the rows whose foreign key is valid.
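
    A common way to handle this, assuming the country constraint is the one being violated, is to filter the source rows against the referenced table so only valid keys are copied; the same pattern extends to any other foreign key on City:

      SET IDENTITY_INSERT City ON;

      INSERT INTO City ([cityid],[city],[country],[state],[cityinfo],
                        [enabled],[countryid],[citycode],[stateid],[latitude],[longitude])
      SELECT s.[cityid], s.[city], s.[country], s.[state], s.[cityinfo],
             s.[enabled], s.[countryid], s.[citycode], s.[stateid], s.[latitude], s.[longitude]
      FROM TD.dbo.City AS s
      WHERE s.[countryid] IN (SELECT countryId FROM dbo.country)   -- keep rows whose country exists
         OR s.[countryid] IS NULL;                                 -- NULL keys do not violate the constraint

      SET IDENTITY_INSERT City OFF;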

    Read the article

  • NHibernate ManyToMany Relationship Cascading AllDeleteOrphan StackOverflowException

    - by Chris
    I have two objects that have a ManyToMany relationship with one another through a mapping table. Though, when I try to save it, I get a stack overflow exception. The following is the code for the mappings: //EventMapping.cs HasManyToMany(x => x.Performers).Table("EventPerformer").Inverse().Cascade.AllDeleteOrphan().LazyLoad().ParentKeyColumn("EventId").ChildKeyColumn("PerformerId"); //PerformerMapping.cs HasManyToMany<Event>(x => x.Events).Table("EventPerformer").Inverse().Cascade.AllDeleteOrphan().LazyLoad().ParentKeyColumn("PerformerId").ChildKeyColumn("EventId"); When I change the performermapping.cs to Cascade.None() I get rid of the exception but then my Event Object doesn't have the performer I associate with it. //In a unit test, paraphrased event.Performers.Add(performer); //Event eventRepository.Save<Event>(event); eventResult = eventRepository.GetById<Event>(event.id); //Event eventResult.Performers[0]; //is null, should have performer in it How should I be writing this properly? Thanks

    Read the article

  • An INSERT conditioned on COUNT

    - by Anders Feder
    How can I construct a MySQL INSERT query that only executes if the number of rows satisfying some condition already in the table is less than 20, and fails otherwise? That is, if the table has 18 rows satisfying the condition, then the INSERT should proceed. If the table has 23 rows satisfying the condition, then the INSERT should fail. For atomicity, I need to express this in a single query, so two requests can not INSERT at the same time, each in the 'belief' that only 19 rows satisfy the condition. Thank you.
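
    For reference, one common way to express this in a single MySQL statement is INSERT ... SELECT with the count check in the WHERE clause, so nothing is inserted once the threshold is reached ('items', its columns and the condition are placeholders):

      INSERT INTO items (user_id, payload)
      SELECT 42, 'some value'
      FROM dual
      WHERE (SELECT COUNT(*) FROM items WHERE user_id = 42) < 20;

    Whether this is fully race-free under concurrent load still depends on the storage engine and isolation level, so it is often paired with a locking read or a constraint that makes the invariant impossible to break.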

    Read the article

  • Entity-attribute-value model using codeigniter / php

    - by John Stewart
    So I am trying to create a way to structure my database so that I can customize forms. I looked into the EAV pattern and here is my DB structure: Table: form - form_id - form_name - form_added_on - form_modified_at Table: form_fields - field_id - form_id - field_type (TEXT, RADIO etc..) - field_default_value - field_required Table: form_data - data_id - field_id - form_id - field_value So now I can store any custom form in the database, and if I want to get the values for an individual form I can simply join on "form_id". The problem: I want to be able to search through all the forms for a specific field value. How can I do that with an EAV model? Also, I thought about just storing the custom data as a serialized (JSON) object, but then I am not sure how I can query that data. Please note that I am using CodeIgniter with MySQL, so the solution can use CodeIgniter libraries if needed.
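
    Searching across all forms is the same join used for a single form, just without pinning form_id and with the predicate on form_data.field_value instead. A sketch against the tables above:

      -- Every form whose stored data contains the search term.
      SELECT f.form_id, f.form_name, ff.field_type, d.field_value
      FROM form_data d
      JOIN form_fields ff ON ff.field_id = d.field_id
      JOIN form f         ON f.form_id   = d.form_id
      WHERE d.field_value LIKE '%search term%';

    In CodeIgniter this maps onto Active Record's join() and like() calls. An index on form_data.field_value helps exact matches; LIKE with a leading wildcard will still scan, so a FULLTEXT index is worth considering for word searches.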

    Read the article

  • Thoughts/Input about Database Design for a CMS

    - by dallasclark
    I'm just about to expand the functionality of our own CMS but was thinking of restructuring the database to make it simpler to add/edit data types and values. Currently, the CMS is quite flat - it requires a field in the database for every type of stored value (manually created). The first option that comes to mind is simply a table which keeps the data types (i.e. Address 1, Suburb, Email Address etc.) and another table which holds values for each of these data types. Just like how WordPress keeps values in the 'options' table, PHP serialize would be used to store an array of values. The second option is how Drupal works: the CMS creates tables for every data type. Unlike WordPress, this can be a bit of overkill, but it is really useful for SQL queries when ordering and grouping by a particular value. What are everyone's thoughts?
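
    A rough sketch of the first option, with illustrative names only: one table defines the field types, the other holds one row per stored value, so adding a new data type is an INSERT rather than a schema change:

      CREATE TABLE field_defs (
          field_id   INT AUTO_INCREMENT PRIMARY KEY,
          field_name VARCHAR(100) NOT NULL          -- 'Address 1', 'Suburb', 'Email Address', ...
      );

      CREATE TABLE field_values (
          item_id  INT NOT NULL,                    -- the CMS record the value belongs to
          field_id INT NOT NULL,
          value    TEXT,
          PRIMARY KEY (item_id, field_id)
      );

    The trade-off raised in the post is real: this stays easy to extend, but ordering or grouping by a value means joining field_values back in once per field, which is exactly where Drupal's per-type tables earn their keep.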

    Read the article

  • SQL Count Query with Grouping by Multiple Columns

    - by Christian
    I have a table with three filled columns named "Name", "City" and "Occupation". I want to create a new column in the same table that contains the number of people who have the same occupation.

      "Name" | "City" | "Occupation"
      ------------------------------
      Amy    | Berlin | Plumber
      Bob    | Berlin | Plumber
      Carol  | Berlin | Lawyer
      David  | London | Plumber

    I want to have a table that contains:

      "Name" | "City" | "Occupation" | "Number"
      ------------------------------------------
      Amy    | Berlin | Plumber      | 2
      Bob    | Berlin | Plumber      | 2
      Carol  | Berlin | Lawyer       | 1
      David  | London | Plumber      | 1

    What does the SQL query that creates the new column have to look like? I want to actually create a new column in the database that I can access later.
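
    For reference, the example output counts per (City, Occupation) pair rather than by occupation alone (David is the only Plumber in London), so the count has to group on both columns. A sketch, with 'people' standing in for the real table name:

      SELECT p.Name, p.City, p.Occupation, c.Number
      FROM people p
      JOIN (
          SELECT City, Occupation, COUNT(*) AS Number
          FROM people
          GROUP BY City, Occupation
      ) c ON c.City = p.City AND c.Occupation = p.Occupation;

    To make Number a real stored column, ALTER TABLE people ADD Number INT and populate it with an UPDATE that joins the same grouped subquery (the exact UPDATE-with-join syntax varies by database), bearing in mind the stored counts go stale whenever rows change.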

    Read the article

  • Can I lock a record from a join SQL statement using ROWLOCK, UPDLOCK?

    - by Andrea.Ko
    I have a stored procedure to get the data I want: SELECT a.SONum, a.Seq1, a.SptNum, a.Qty1, a.SalUniPriP, a.PayNum, a.InvNum, a.BLNum, c.ETD, c.ShpNum, f.IssBan FROM OrdD a JOIN OrdH b ON a.SONum = b.SONum LEFT JOIN Invh c ON a.InvNum = c.InvNum LEFT JOIN cus d ON b.CusCod = d.CusCod LEFT JOIN BL e ON a.BLNum = e.BLNum LEFT JOIN PayMasH f ON f.PayNum = a.PayNum LEFT JOIN Shipment g ON g.ShpNum = c.ShpNum WHERE b.CusCod IN (SELECT CusCod FROM UsrInc WHERE UseID=@UserID and UseLev=@UserLvl) AND d.CusGrp = @CusGrp After I get those records into a cursor, I use ROWLOCK, UPDLOCK to lock all the related invoice numbers: SELECT InvNum FROM Invh WITH (ROWLOCK,UPDLOCK) WHERE InvNum = Can I apply the lock to the Invh table at the point where I select it in the join across several tables? Any advice, please!
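
    As far as I know, T-SQL table hints can be applied per table inside a join, so the Invh rows can be locked at the moment they are first read instead of being re-selected afterwards. A sketch (columns trimmed; the locks are only held beyond the statement if an explicit transaction is open):

      BEGIN TRANSACTION;

      SELECT a.SONum, a.InvNum, c.ETD, c.ShpNum
      FROM OrdD a
      JOIN OrdH b ON a.SONum = b.SONum
      LEFT JOIN Invh c WITH (ROWLOCK, UPDLOCK) ON a.InvNum = c.InvNum
      WHERE b.CusCod IN (SELECT CusCod FROM UsrInc WHERE UseID = @UserID AND UseLev = @UserLvl);

      -- ... work with the rows; the update locks on Invh persist until COMMIT or ROLLBACK.

      COMMIT TRANSACTION;

    UPDLOCK blocks other writers (and other UPDLOCK readers) but not plain readers, so it prevents lost updates without freezing reporting queries.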

    Read the article

  • Protection against CheatEngine and other injectors [duplicate]

    - by Lucas
    This question already has an answer here: Strategies to Defeat Memory Editors for Cheating - Desktop Games (10 answers). Is protection against CheatEngine and other injection tools possible? I've been thinking about it, and the only idea I've got is to write a small application that scans the running processes every second and exits the game client immediately if any injector is found. I'm writing here to get your opinions, as some of you may have experience protecting game clients against DLL or PYC injection.

    Read the article

  • Extremely slow insert from Delphi to Remote MySQL Database

    - by MarkRobinson
    Having a major hair-pulling issue with extremely slow inserts from Delphi 2010 to a remote MySQL 5.09 server. So far I have tried: ADO using the MySQL ODBC driver, and ZeosLib v7 Alpha. I have used batching and direct insert with ADO (using table access), and with Zeos I have used SQL insertion with a Query, then Table direct mode, and also cached updates in Table mode using ApplyUpdates and Commit. I have tried both technologies with compression on and off. So far I have seen pretty much the same result across the board: 7.5 records per second! Now, from this point I would assume that the remote server is just slow, but MySQL Workbench is amazingly fast, and the Migration Toolkit managed the initial migration very quickly (to be honest, I don't recall how quickly, which kind of means it was quick). I'm just about to try the MyDAC components, as we already use SDAC (wish there was a multi-buy discount, or that we'd chosen UniDAC instead now!). Any ideas?
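
    For what it's worth, roughly 7.5 rows per second over a WAN usually points at one network round trip plus an implicit commit per INSERT rather than at the component library itself. A sketch of what batching looks like at the SQL level, with placeholder names; whether ADO or Zeos actually emit something like this depends on how they are configured:

      START TRANSACTION;

      INSERT INTO target_table (col_a, col_b) VALUES
          ('row 1', 1),
          ('row 2', 2),
          ('row 3', 3);

      COMMIT;

    One multi-row statement per few hundred rows, committed in batches, typically turns a per-row latency problem into a throughput one.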

    Read the article

  • Separating weakly linked database schemas

    - by jldugger
    I've been tasked with revisiting a database schema we designed and use internally for various ticketing and reporting systems. Currently there exist about 40 tables in one Oracle database schema supporting perhaps six webapps. However, there's one unifying relationship amongst them all: a rooms table describing the room. Room name, purpose and other data are thrown into a shared table for each app. My initial idea was to pull each of these applications into a separate database and perform joins between a given database and the room database. But I've discovered this solution prevents foreign key constraints in SQL Server 2005. It seems silly to duplicate one table for each app and keep those multiple copies synchronized. Should I just leave everything in one large DB, or is there something else I can do to separate the tables without losing FK constraints?
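
    One alternative sometimes used in SQL Server (not from the post itself) is to keep a single database but separate the applications by schema; foreign keys work across schemas, while cross-database foreign keys are not supported. A sketch:

      -- CREATE SCHEMA must run in its own batch.
      CREATE SCHEMA ticketing;

      CREATE TABLE dbo.rooms (
          room_id INT PRIMARY KEY,
          name    NVARCHAR(100) NOT NULL
      );

      CREATE TABLE ticketing.tickets (
          ticket_id INT IDENTITY(1,1) PRIMARY KEY,
          room_id   INT NOT NULL REFERENCES dbo.rooms (room_id)
      );

    Each app gets its own namespace, permissions can be granted per schema, and the shared rooms table stays a real foreign-key target instead of a synchronized copy.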

    Read the article

  • Inserting non-pod struct into a GHashTable

    - by RikSaunderson
    Hi there, I'm trying to build a GHashTable of instances of a struct containing ints, a time_t and a few char*'s. My question is, how do you insert an instance of a struct into a GHashTable? There are plenty of examples of how to insert a string or an int (using g_str_hash and g_int_hash respectively), but I'm guessing that I want to use g_direct_hash, and I can't seem to find any examples of that. Ideally, my code would look like this: GHashtable table; table = g_hash_table_new(g_direct_hash, g_direct_equal); struct mystruct; mystruct.a = 1; mystruct.b = "hello"; mystruct.c = 5; mystruct.d = "test"; g_hash_table_insert(table,mystruct.a,mystruct); Clearly, this is incorrect as it does not compile. Can anyone provide an example that does do what I want? Thanks, Rik

    Read the article

  • CakePHP Bake association problem

    - by Apu
    I have only two tables in my database with a one-to-many relationship between them (user hasMany messages) and am trying to get basic CRUD functionality going. Bake detects the associations correctly and specifies them correctly inside the model classes, but in controllers and views it looks like Cake doesn't know anything about those associations -- I don't even get a select tag for user_id when I go add a new message. Has anyone come across this problem before? What can I be doing wrong? Table structure appears to be fine: CREATE TABLE users ( id int(11) NOT NULL AUTO_INCREMENT, username varchar(255) NOT NULL, `password` varchar(255) NOT NULL, email varchar(255) NOT NULL, created datetime NOT NULL, modified datetime NOT NULL, PRIMARY KEY (id) ) ENGINE=MyISAM DEFAULT CHARSET=utf8; CREATE TABLE IF NOT EXISTS `messages` ( `id` int(11) NOT NULL AUTO_INCREMENT, `user_id` int(11) NOT NULL, `content` varchar(255) NOT NULL, `created` datetime NOT NULL, `modified` datetime NOT NULL, PRIMARY KEY (`id`) ) ENGINE=MyISAM DEFAULT CHARSET=utf8 AUTO_INCREMENT=1 ;

    Read the article

  • From the Tips Box: Telescope Laser Sights, Dropbox Desktops, and Kindle Clipping Conversions

    - by Jason Fitzpatrick
    Once a week we round up some great reader tips and share them with everyone; this week we’re looking at telescope laser sights, syncing your desktop with Dropbox, and converting your Kindle Clippings file.

    Read the article

  • Wireless Broadcom 4313 not working on Ubuntu 12.04

    - by user88568
    It seems a lot of people are having this problem, but none of the posted solutions have worked for me so far. My driver is installed and activated, I have tried removing and re-adding the network, and various other fixes. No networks were picked up on my first boot, the next day wireless worked fine, and since then it does not detect networks, and when I manually try to connect, it repeatedly asks for the password and does not connect. Here's my info: 03:00.0 Network controller: Broadcom Corporation BCM4313 802.11b/g/n Wireless LAN Controller (rev 01) Subsystem: Dell Inspiron M5010 / XPS 8300 Flags: bus master, fast devsel, latency 0, IRQ 17 Memory at f0500000 (64-bit, non-prefetchable) [size=16K] Capabilities: [40] Power Management version 3 Capabilities: [58] Vendor Specific Information: Len=78 Capabilities: [48] MSI: Enable- Count=1/1 Maskable- 64bit+ Capabilities: [d0] Express Endpoint, MSI 00 Capabilities: [100] Advanced Error Reporting Capabilities: [13c] Virtual Channel Capabilities: [160] Device Serial Number 00-00-a1-ff-ff-f3-70-f1 Capabilities: [16c] Power Budgeting Kernel driver in use: wl Kernel modules: wl, bcma, brcmsmac root@michelle-laptop:/home/michelle# ifconfig eth0 Link encap:Ethernet HWaddr 70:f1:a1:f3:ba:ab inet6 addr: fe80::72f1:a1ff:fef3:baab/64 Scope:Link UP BROADCAST MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:29 TX packets:0 errors:30 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 B) TX bytes:0 (0.0 B) Interrupt:17 eth1 Link encap:Ethernet HWaddr f0:4d:a2:53:83:7a UP BROADCAST MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:0 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 B) TX bytes:0 (0.0 B) Interrupt:43 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:104 errors:0 dropped:0 overruns:0 frame:0 TX packets:104 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:8000 (8.0 KB) TX bytes:8000 (8.0 KB) root@michelle-laptop:/home/michelle# lsmod Module Size Used by dm_crypt 22528 0 snd_hda_codec_hdmi 31775 1 snd_hda_codec_realtek 174313 1 snd_hda_intel 32765 5 snd_hda_codec 109562 3 snd_hda_codec_hdmi,snd_hda_codec_realtek,snd_hda_intel snd_hwdep 13276 1 snd_hda_codec snd_pcm 80845 4 snd_hda_codec_hdmi,snd_hda_intel,snd_hda_codec snd_seq_midi 13132 0 snd_rawmidi 25424 1 snd_seq_midi snd_seq_midi_event 14475 1 snd_seq_midi snd_seq 51567 2 snd_seq_midi,snd_seq_midi_event parport_pc 32114 0 ppdev 12849 0 binfmt_misc 17292 1 lib80211_crypt_tkip 17275 0 bnep 17830 2 rfcomm 38139 0 snd_timer 28931 2 snd_pcm,snd_seq snd_seq_device 14172 3 snd_seq_midi,snd_rawmidi,snd_seq joydev 17393 0 wl 2646601 0 snd 62064 19 snd_hda_codec_hdmi,snd_hda_codec_realtek,snd_hda_intel,snd_hda_codec,snd_hwdep,snd_pcm,snd_rawmidi,snd_seq,snd_timer,snd_seq_device soundcore 14635 1 snd btusb 17912 0 bluetooth 158438 11 bnep,rfcomm,btusb uvcvideo 67203 0 videodev 86588 1 uvcvideo snd_page_alloc 14108 2 snd_hda_intel,snd_pcm lib80211 14040 2 lib80211_crypt_tkip,wl intel_ips 17822 0 psmouse 72919 0 serio_raw 13027 0 mei 36570 0 dell_laptop 17767 0 dell_wmi 12601 0 dcdbas 14098 1 dell_laptop sparse_keymap 13658 1 dell_wmi mac_hid 13077 0 lp 17455 0 parport 40930 3 parport_pc,ppdev,lp usbhid 41906 0 hid 77367 1 usbhid wmi 18744 1 dell_wmi i915 414817 3 atl1c 36718 0 drm_kms_helper 45466 1 i915 drm 197692 4 i915,drm_kms_helper i2c_algo_bit 13199 
1 i915 video 19068 1 i915 root@michelle-laptop:/home/michelle# iwlist scan lo Interface doesn't support scanning. eth1 Interface doesn't support scanning. eth0 No scan results root@michelle-laptop:/home/michelle# rfkill list 0: dell-wifi: Wireless LAN Soft blocked: no Hard blocked: no 1: dell-bluetooth: Bluetooth Soft blocked: yes Hard blocked: no 3: brcmwl-0: Wireless LAN Soft blocked: no Hard blocked: no Obviously I don't know what I'm doing. I'd appreciate any help! Thanks!

    Read the article
