Search Results

Search found 52418 results on 2097 pages for 'free database security ev'.


  • MySQL database is indexed in Apache Solr; how do I access it via URL?

    - by Wasim
    data-config.xml:

        <dataConfig>
          <dataSource encoding="UTF-8" type="JdbcDataSource"
                      driver="com.mysql.jdbc.Driver"
                      url="jdbc:mysql://localhost:3306/somevisits"
                      user="root" password=""/>
          <document name="somevisits">
            <entity name="login" query="select * from login">
              <field column="sv_id" name="sv_id" />
              <field column="sv_username" name="sv_username" />
            </entity>
          </document>
        </dataConfig>

    schema.xml:

        <?xml version="1.0" encoding="UTF-8" ?>
        <schema name="example" version="1.5">
          <fields>
            <field name="sv_id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
            <field name="username" type="string" indexed="true" stored="true" required="true"/>
            <field name="_version_" type="long" indexed="true" stored="true" multiValued="false"/>
            <field name="text" type="string" indexed="true" stored="false" multiValued="true"/>
          </fields>
          <uniqueKey>sv_id</uniqueKey>
          <types>
            <fieldType name="string" class="solr.StrField" sortMissingLast="true" />
            <fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
          </types>
        </schema>

    Solr successfully imported the MySQL database using a full import: http://[localSolr]:8983/solr/#/collection1/dataimport?command=full-import My question is: how do I access that imported MySQL data now?
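    Once the import finishes, the documents live in the Solr index and are served by Solr's search handler over HTTP, not by MySQL. A minimal sketch, assuming the default collection1 core and /select handler (the sv_username value is illustrative):

        # everything that was imported
        http://localhost:8983/solr/collection1/select?q=*:*&wt=json&indent=true

        # a field query
        http://localhost:8983/solr/collection1/select?q=sv_username:someuser&wt=json

    One thing to check first: data-config.xml maps the column to sv_username, but schema.xml declares a field named username; the two names have to agree, or the import silently drops that field.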

    Read the article

  • Creating an Android app database with a large amount of data

    - by Thomas
    Hi all, The database of my application needs to be filled with a lot of data, so during onCreate() there are not just a few CREATE TABLE statements but also a lot of INSERTs. The solution I chose is to store all these instructions in a SQL file located in res/raw, loaded with Resources.openRawResource(id). It works well, but I am facing an encoding issue: the SQL file contains some accented characters which appear garbled in my application. This is my code:

        public String getFileContent(Resources resources, int rawId) throws IOException {
            InputStream is = resources.openRawResource(rawId);
            int size = is.available();
            // Read the entire asset into a local byte buffer.
            byte[] buffer = new byte[size];
            is.read(buffer);
            is.close();
            // Convert the buffer into a string.
            return new String(buffer);
        }

        public void onCreate(SQLiteDatabase db) {
            try {
                // get file content
                String sqlCode = getFileContent(mCtx.getResources(), R.raw.db_create);
                // execute code
                for (String sqlStatements : sqlCode.split(";")) {
                    db.execSQL(sqlStatements);
                }
                Log.v("DbHelper", "Creating database done."); // tag argument added so the Log calls compile
            } catch (IOException e) {
                // Should never happen!
                Log.e("DbHelper", "Error reading sql file " + e.getMessage(), e);
                throw new RuntimeException(e);
            } catch (SQLException e) {
                Log.e("DbHelper", "Error executing sql code " + e.getMessage(), e);
                throw new RuntimeException(e);
            }
        }

    The workaround I found is to load the SQL instructions from a huge static final String instead of a file; then all accented characters appear correctly. But isn't there a more elegant way to load SQL instructions than a big static final String attribute holding them all? Thanks in advance, Thomas
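    The garbling usually comes from new String(buffer), which decodes the bytes with the platform's default charset rather than the file's actual encoding. A minimal sketch of the fix, assuming the file in res/raw is saved as UTF-8 (it also reads the stream to EOF instead of trusting available(); imports are java.io.ByteArrayOutputStream, java.io.InputStream):

        public String getFileContent(Resources resources, int rawId) throws IOException {
            InputStream is = resources.openRawResource(rawId);
            try {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                byte[] chunk = new byte[4096];
                int n;
                // available() is not guaranteed to report the full length; read to EOF instead
                while ((n = is.read(chunk)) != -1) {
                    out.write(chunk, 0, n);
                }
                // decode explicitly as UTF-8 instead of the platform default
                return new String(out.toByteArray(), "UTF-8");
            } finally {
                is.close();
            }
        }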

    Read the article

  • Page URL and database organization.

    - by shurik2533
    I want a page's name to be its address: for example, if a page has the heading "Some Page", its address should be http://somesite/some_page/. The "some_page" name is generated by the system automatically, and "some_page" is the unique identifier of the page. The problem is that a user may later enter a name which already exists, which would cause an error, so I need a solution that stays workable for large volumes of data. I have solved the problem as follows: the page identifier in the database is the page name plus a suffix, which defaults to zero. When a page is added there is an existence check; if no such page exists, the suffix is 0 and the name is "some_page". If the page does exist, I look up the maximum suffix, set suffix = suffix + 1, and the page name becomes "some_page_1". For this I created a compound key in the database from the "suffix" and "pageName" fields:

        Table Pages
        suffix | pageName   | pageTitle
        0      | some_page  | Some Page
        1      | some_page  | Some Page
        0      | other_page | Other Page

    Pages are added through a stored procedure:

        CREATE PROCEDURE addPage (pageNameVal VARCHAR(100), pageTitleVal VARCHAR(100))
        BEGIN
          DECLARE v INT DEFAULT 0;
          SELECT MAX(suffix) FROM pages WHERE pageName = pageNameVal INTO v;
          IF v >= 0 THEN
            SET v = v + 1;
          ELSE
            SET v = 0;
          END IF;
          -- column list and value order fixed: the original inserted three values into two columns
          INSERT INTO pages (suffix, pageName, pageTitle) VALUES (v, pageNameVal, pageTitleVal);
        END;

    Are there better solutions?
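    As a hedged sketch of one possible simplification: the lookup and the insert can be a single statement, which also narrows the race window when two users submit the same name at once (a UNIQUE key over (pageName, suffix) would catch whatever slips through):

        INSERT INTO pages (suffix, pageName, pageTitle)
        SELECT COALESCE(MAX(suffix) + 1, 0), 'some_page', 'Some Page'
        FROM pages
        WHERE pageName = 'some_page';

    COALESCE handles the very first insert, where MAX(suffix) is NULL.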

    Read the article

  • MySQL - Simple database design

    - by sequelDesigner
    Hello guys, I would like to develop a system where the user gets data dynamically (what I mean by dynamic is without reloading pages, using AJAX... but that does not matter much). My situation is this: I have a table I call "player", in which I store player information such as name, level, experience, etc. Each player can wear different clothes (tops/shirts, bottoms, shoes, and a hairstyle), and each player can own more than one top, bottom, pair of shoes, etc. What I am not very sure about is how you would normally store such data. My current design is:

        Player Table
        id | name    | (other player info) | wearing                       | tops  | bottoms
        1  | player1 |                     | top=1;bottom=2;shoes=5;hair=8 | 1,2,3 | 7,2,3

        Tops Table
        id | name    | etc...
        1  | t-shirt | ...

    I am not sure this design is good. If you were the database designer, how would you design the database, and how would you store these items? Please advise. Thanks
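    A hedged sketch of the usual normalized shape (all names illustrative): one row per owned item in a join table, with a flag for what is currently worn, instead of packed strings like top=1;bottom=2 or comma lists:

        CREATE TABLE player (
          id   INT PRIMARY KEY AUTO_INCREMENT,
          name VARCHAR(50) NOT NULL
          -- level, experience, ...
        );

        CREATE TABLE item (
          id   INT PRIMARY KEY AUTO_INCREMENT,
          kind ENUM('top','bottom','shoes','hair') NOT NULL,
          name VARCHAR(50) NOT NULL
        );

        CREATE TABLE player_item (
          player_id INT NOT NULL,
          item_id   INT NOT NULL,
          is_worn   BOOLEAN NOT NULL DEFAULT FALSE,
          PRIMARY KEY (player_id, item_id),
          FOREIGN KEY (player_id) REFERENCES player(id),
          FOREIGN KEY (item_id)   REFERENCES item(id)
        );

        -- everything player 1 is currently wearing, in one query
        SELECT i.kind, i.name
        FROM player_item pi
        JOIN item i ON i.id = pi.item_id
        WHERE pi.player_id = 1 AND pi.is_worn;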

    Read the article

  • insert multiple rows into database from arrays

    - by Mark
    Hi, I need some help with inserting multiple rows from different arrays into my database. I am building the database for a seating plan: each seating block has 5 rows (A-E), and each row has 15 seats. My DB columns are seat_id, seat_block, seat_row, seat_number, so I need to add 15 seat_numbers for each seat_row and 5 seat_rows for each seat_block. I mocked it up with some foreach loops, but need some help turning it into a (hopefully single) SQL statement:

        $blocks = array("A","B","C","D");
        $seat_rows = array("A","B","C","D","E");
        $seat_nums = array("1","2","3","4","5","6","7","8","9","10","11","12","13","14","15");

        foreach ($blocks as $block) {
            echo "<br><br>";
            echo "Block: " . $block . " - ";
            foreach ($seat_rows as $rows) {
                echo "Row: " . $rows . ", ";
                foreach ($seat_nums as $seats) {
                    echo "seat:" . $seats . " ";
                }
            }
        }

    Maybe there's a better way of doing it than arrays? I just want to avoid writing an SQL statement that is over 100 lines long ;) (I'm using CodeIgniter too, if anyone knows of a CI-specific way of doing it, but I'm not too bothered about that.)
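    A sketch reusing the same three loops to build one data set, then letting CodeIgniter issue a single multi-row INSERT via insert_batch() (assuming a CI version that has it; the 'seats' table name is illustrative):

        $data = array();
        foreach ($blocks as $block) {
            foreach ($seat_rows as $row) {
                foreach ($seat_nums as $seat) {
                    $data[] = array(
                        'seat_block'  => $block,
                        'seat_row'    => $row,
                        'seat_number' => $seat,
                    );
                }
            }
        }
        // one INSERT for all 4 * 5 * 15 = 300 rows
        $this->db->insert_batch('seats', $data);

    Plain PHP works the same way: implode() the row tuples into a single INSERT ... VALUES (...),(...),... string.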

    Read the article

  • The best way to structure this database?

    - by James P
    At the moment I'm doing this:

        gems(id, name, colour, level, effects, source)

    id is the primary key and is not auto-increment. A typical row of data looks like this:

        id      => 40153
        name    => Veiled Ametrine
        colour  => Orange
        level   => 80
        effects => +12 sp, +10 hit
        source  => Ametrine

    (Some of you gamers might see what I'm doing here :) ) But I realise this could be organised a lot better. I studied database relationships and secondary keys in my A-Level computing class but never got as far as setting one up properly. I just need help with how this database should be organised: which tables should hold which data, with what secondary and foreign keys? I was thinking maybe 3 tables: gem, effects, source, which then have relationships to each other? Can anyone shed some light on this? Is a complex layout like I'm proposing really the way to go, or should I just carry on with what I'm doing? Cheers.
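    For what it's worth, a hedged sketch of that split (names illustrative): the multi-valued effects column ("+12 sp, +10 hit") is the part that most clearly wants its own tables, with the amount living on the link row:

        CREATE TABLE gem (
          id     INT PRIMARY KEY,      -- e.g. 40153; supplied externally, so not auto-increment
          name   VARCHAR(60) NOT NULL,
          colour VARCHAR(20) NOT NULL,
          level  INT NOT NULL,
          source VARCHAR(60) NOT NULL  -- could become a source(id) foreign key if sources repeat
        );

        CREATE TABLE effect (
          id   INT PRIMARY KEY AUTO_INCREMENT,
          stat VARCHAR(20) NOT NULL    -- 'sp', 'hit', ...
        );

        CREATE TABLE gem_effect (
          gem_id    INT NOT NULL,
          effect_id INT NOT NULL,
          amount    INT NOT NULL,      -- +12, +10, ...
          PRIMARY KEY (gem_id, effect_id),
          FOREIGN KEY (gem_id)    REFERENCES gem(id),
          FOREIGN KEY (effect_id) REFERENCES effect(id)
        );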

    Read the article

  • How to handle product ratings in a database

    - by Mel
    Hello, I would like to know the best approach to storing product ratings in a database. I have in mind the following two (simplified, assuming a MySQL db) scenarios:

    Scenario 1: Create two columns in the product table to store the number of votes and the sum of all votes, and compute an average on the product display page:

        products(productID, productName, voteCount, voteSum)

    Pros: I only need to access one table, and thus execute one query, to display product data and ratings. Cons: write operations are executed against a table whose original purpose is only to furnish product data.

    Scenario 2: Create an additional table to store ratings:

        products(productID, productName)
        ratings(productID, voteCount, voteSum)

    Pros: ratings are isolated in a separate table, leaving the products table to furnish data on available products. Cons: I have to execute two separate queries on product page requests (one for data and another for ratings).

    In terms of performance, which approach is best: allowing users to execute an occasional write query against a table that handles hundreds of read requests, or executing two queries on every product page but isolating the writes in a separate table? I'm a novice at database development and often find myself struggling with simple questions such as these. Many thanks
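    One point worth folding into the comparison: scenario 2's two reads can still be a single query with a join, so the read-side difference is smaller than it looks. A sketch:

        SELECT p.productID, p.productName,
               r.voteSum / NULLIF(r.voteCount, 0) AS avgRating
        FROM products p
        LEFT JOIN ratings r ON r.productID = p.productID
        WHERE p.productID = 42;  -- illustrative id

    NULLIF avoids a division by zero for unrated products, and the LEFT JOIN keeps them in the result.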

    Read the article

  • Is it possible for double-escaping to cause harm to the DB?

    - by waiwai933
    If I accidentally double-escape a string, can the DB be harmed? For the purposes of this question, let's say I'm not using parameterized queries. For example, let's say I get the following input:

        bob's bike

    And I escape it:

        bob\'s bike

    But my code is horrible, and escapes it again:

        bob\\\'s bike

    Now, if I insert that into a DB, the value in the DB will be

        bob\'s bike

    which, while not what I want, won't harm the DB. Is it possible for any input that's double-escaped to do something malicious to the DB, assuming I take all other necessary security precautions?

    Read the article

  • Unaccounted for database size

    - by Nazadus
    I currently have a database that is 20GB in size. I've run a few scripts which show each table's size (and other incredibly useful information such as index stats), and the biggest table is 1.1 million records, which takes up 150MB of data. We have fewer than 50 tables, most of which take up less than 1MB of data. After looking at the size of each table, I don't understand why the database shouldn't be 1GB in size after a shrink. The amount of available free space that SqlServer (2005) reports is 0%. The log mode is set to simple. At this point my main concern is that I seem to have 19GB of unaccounted-for used space. Is there something else I should look at? Normally I wouldn't care and would make this a passive research project, except this particular situation calls for us to do a backup and restore on a weekly basis to put a copy on a satellite (which has no internet, so it must be done manually). I'd much rather copy 1GB (or even 5GB!) than 20GB of data each week. sp_spaceused reports the following (database_size, unallocated space):

        Navigator-Production   19184.56 MB   3.02 MB

    And the second part of it (reserved, data, index_size, unused):

        19640872 KB   19512112 KB   108184 KB   20576 KB

    I've found a few other scripts (such as the ones from two of the server database size questions here), and they all report the same information as above or below. The script I am using is from SqlTeam. Here is the header info:

        * BigTables.sql
        * Bill Graziano (SQLTeam.com)
        * graz@<email removed>
        * v1.11

    The top few tables show this (table, rows, reserved space, data, index, unused, etc.):

        Activity          1143639   131 MB   89 MB   41768 KB   1648 KB    46%   1%
        EventAttendance    883261    90 MB   58 MB   32264 KB    328 KB    54%   0%
        Person             113437    31 MB   15 MB   15752 KB    912 KB   103%   3%
        HouseholdMember    113443    12 MB    6 MB    5224 KB    432 KB    82%   4%
        PostalAddress       48870     8 MB    6 MB    2200 KB    280 KB    36%   3%

    The rest of the tables are the same size or smaller. No more than 50 tables.

    Update 1: All tables use unique identifiers, usually an int incremented by 1 per row. I've also re-indexed everything. I ran the DBCC shrink command as well as updating the usage, before and after, over and over. An interesting thing I found is that when I restarted the server and confirmed no one was using it (and no maintenance procs were running; this is a very new application, under a week old), the shrink would every now and then say something about data having changed. Googling yielded too few useful answers, with the obvious ones not applying (it was 1am and I had disconnected everyone, so it seems impossible that was really the case). The data was migrated via C# code which basically looked at another server and brought things over. The quantity of deletes, at this point in time, is probably under 50k rows. Even if those rows were the biggest rows, that wouldn't be more than 100MB, I would imagine. When I go to shrink via the GUI it reports 0% available to shrink, indicating that I've already gotten it as small as it thinks it can go.

    Update 2: sp_spaceused 'Activity' yields this (which seems right on the money):

        Activity   1143639   134488 KB   91072 KB   41768 KB   1648 KB

    Fill factor was 90. All primary keys are ints. Here is the command I used to update usage:

        DBCC UPDATEUSAGE(0);

    Update 3: Per Edosoft's request:

        Image   111975   2407773   19262184

    It appears as though the Image table believes it's the 19GB portion. I don't understand what this means, though. Is it really 19GB, or is it misrepresented?

    Update 4: Talking to a co-worker, I found out that it's because of the pages, as someone else here has also stated is a possibility. The only index on the Image table is a clustered PK. Is this something I can fix, or do I just have to deal with it? The regular script shows the Image table to be 6MB in size.

    Update 5: I think I'm just going to have to deal with it after further research. The images have been resized to be roughly 2-5KB each, and on a normal file system they wouldn't consume much space, but on SqlServer they seem to consume considerably more. The real answer, in the long run, will likely be separating that table into another partition or something similar.
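    A hedged way to see where the pages actually went on SQL Server 2005 is to total the allocation units per table and type; a LOB-heavy Image table shows up immediately as LOB_DATA. A sketch (strictly, IN_ROW/ROW_OVERFLOW units join on hobt_id rather than partition_id, but the two match in typical databases):

        -- space per user table and allocation type, largest first (pages are 8 KB)
        SELECT o.name AS table_name,
               au.type_desc,                 -- IN_ROW_DATA, LOB_DATA, ROW_OVERFLOW_DATA
               SUM(au.total_pages) * 8 AS total_kb
        FROM sys.allocation_units au
        JOIN sys.partitions p ON p.partition_id = au.container_id
        JOIN sys.objects o    ON o.object_id = p.object_id
        WHERE o.type = 'U'
        GROUP BY o.name, au.type_desc
        ORDER BY total_kb DESC;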

    Read the article

  • What is the sense of forbidding passwords longer than xx chars?

    - by reox
    It's more of a usability question, or maybe database, or even security (consider injection attacks): what is the sense of limiting a user's password to no longer than xx chars? It does not make any sense to me, because longer passwords are generally considered stronger and harder to crack, and some users use password safes, so password length should not matter. I understand that passwords with more than 20 chars are hard to remember, but if you use diceware or a password safe you don't have any problem with that. I really can't understand why there are sites that say "your password needs to be between 5 and 8 chars"... Also, the password should be saved as a hash, so the length of the field in the database is fixed; so where is the problem? I think most of the sites where the password has to be a fixed length are not even using any hashing method...
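    The fixed-length point is easy to demonstrate; a sketch in PHP, assuming PHP >= 5.5 for password_hash():

        <?php
        // bcrypt output is always 60 characters, whatever the input length,
        // so the database column can be a fixed CHAR(60)
        $short = password_hash('abc', PASSWORD_BCRYPT);
        $long  = password_hash(str_repeat('correct horse ', 30), PASSWORD_BCRYPT);
        var_dump(strlen($short)); // int(60)
        var_dump(strlen($long));  // int(60)
        // one caveat: bcrypt itself only looks at the first 72 bytes of input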

    Read the article

  • Why does Spring Security Core RC4 require Grails 2.3?

    - by bksaville
    On the Spring Security Core plugin GitHub repo, I see that on May 21st Graeme upped the required version of Grails from 2.0 to 2.3, before the RC4 version was released a couple of months later, but I don't see any explanation of why. Was it mismatched dependencies, bug reports, etc.? I run a 2.2.4 app, and I would prefer not to upgrade at this point just to get the latest RC of Spring Security Core. I would understand if the upgrade to Spring Security 3.2.0.RELEASE caused mismatched dependencies with older versions of Grails, since I've run into the same issue before. This originally came up due to a pull request on the Spring Security OAuth2 provider plugin that I maintain: the pull request upped the required version to 2.3, and the requester pointed me to the RC4 release of core as the reason. Thanks for the good work as always!

    Read the article

  • How to set up WebLogic 10.3.3 security for JAX-WS web services?

    - by Roman Kagan
    I have quite a simple task to accomplish: I have to set up security for web services (basic authentication with a user id and password hardcoded in WLES). I set up web.xml (see the code fragment below), but I am having a tough time configuring WebLogic. I added an IdentityAssertionAuthenticator authentication provider, set it as Required, modified the DefaultAuthenticator to Optional, then went to the deployed application's security settings and set the role to "thisIsUser", and at some point it worked, but not anymore (I redeployed the war file and set the web service security the same way, but to no avail). I'd greatly appreciate all your help.

        <security-constraint>
          <display-name>SecurityConstraint</display-name>
          <web-resource-collection>
            <web-resource-name>ABC</web-resource-name>
            <url-pattern>/ABC</url-pattern>
          </web-resource-collection>
          <auth-constraint>
            <role-name>thisIsUser</role-name>
          </auth-constraint>
        </security-constraint>
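    One thing the fragment above omits: for HTTP basic authentication the servlet descriptor also needs a login-config and a declared security-role next to the constraint, and WebLogic expects the role mapped to a principal in weblogic.xml. A sketch of those missing pieces (the principal name is an assumption):

        <!-- web.xml -->
        <login-config>
          <auth-method>BASIC</auth-method>
        </login-config>
        <security-role>
          <role-name>thisIsUser</role-name>
        </security-role>

        <!-- weblogic.xml: map the role to a user or group known to the realm -->
        <security-role-assignment>
          <role-name>thisIsUser</role-name>
          <principal-name>myWebServiceUser</principal-name>
        </security-role-assignment>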

    Read the article

  • password limitations in SQL Server and MySql

    - by asteroid
    Do MySql 5.1 and SQL Server 2008 (Web edition, Standard) have any functional password limitations other than length limits? Are metacharacters in any form a bad idea to use, like bang, pipe, hash, any slash, caret, and so on? I know that MySql 5.1 has a hardcoded password length limitation of 16 characters, but I was wondering whether any metacharacters (i.e. non-alphanumerics) are a bad idea to use, and whether the same holds in SQL Server 2008 Web edition, Standard. So specifically: can symbols like /`~:}{[]^ be used successfully? I would hope it doesn't matter to the database, but I don't yet understand enough about password storage in enterprise database systems to know for sure, and I was looking for confirmation or an explanation.

    Read the article

  • How can I add Active Directory security groups to a SharePoint site to control permissions, rather than individual user accounts?

    - by user574811
    SharePoint integrates Active Directory accounts, of course, but what about security groups? I have a few sites where I'm fairly confident access is going through existing Active Directory (AD) security groups (i.e. only an AD security group has been granted permissions through 'People and Groups'). In another situation, where I created the AD group and granted it permissions to a site, the customers were not able to get access immediately. Eventually I had to fast-track it and add the individuals to People and Groups to keep the project going, but I'm hoping not to have to maintain it that way. Are there any specific requirements for the security group in AD: universal, global, or domain local? And is there any time delay between modifying group members in AD and having that take effect in SharePoint?

    Read the article

  • ubuntu/apt-get update said "Failed to Fetch http:// .... 404 not found"

    - by lindenb
    Hi all, I'm trying to run apt-get update on ubuntu 9.10. I've configured my proxy server and I can access the internet without any problem:

        /etc/apt# wget "http://www.google.com"
        Resolving (...)
        Proxy request sent, awaiting response... 200 OK
        Length: 292 [text/html]
        Saving to: `index.html'
        100%[======================================>] 292  --.-K/s  in 0s
        2010-04-02 17:20:33 (29.8 MB/s) - `index.html' saved [292/292]

    But when I try to use apt-get I get the following messages:

        Ign http://archive.ubuntu.com karmic Release.gpg
        Ign http://ubuntu.univ-nantes.fr karmic Release.gpg
        Ign http://ubuntu.univ-nantes.fr karmic/main Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic/restricted Translation-en_US
        Ign http://archive.ubuntu.com karmic Release
        Ign http://ubuntu.univ-nantes.fr karmic/multiverse Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic/universe Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic-updates Release.gpg
        Ign http://archive.ubuntu.com karmic/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/main Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic-updates/restricted Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic-updates/multiverse Translation-en_US
        Ign http://archive.ubuntu.com karmic/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/universe Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic-security Release.gpg
        Ign http://archive.ubuntu.com karmic/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/main Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic-security/restricted Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic-security/multiverse Translation-en_US
        Ign http://archive.ubuntu.com karmic/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/universe Translation-en_US
        Ign http://ubuntu.univ-nantes.fr karmic Release
        Err http://archive.ubuntu.com karmic/main Sources  404 Not Found
        Ign http://ubuntu.univ-nantes.fr karmic-updates Release
        Ign http://ubuntu.univ-nantes.fr karmic-security Release
        Err http://archive.ubuntu.com karmic/restricted Sources  404 Not Found
        Ign http://ubuntu.univ-nantes.fr karmic/main Packages
        Ign http://ubuntu.univ-nantes.fr karmic/restricted Packages
        Ign http://ubuntu.univ-nantes.fr karmic/multiverse Packages
        Ign http://ubuntu.univ-nantes.fr karmic/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic/universe Sources
        Ign http://ubuntu.univ-nantes.fr karmic/universe Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/main Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/restricted Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/multiverse Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/universe Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/universe Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/main Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/restricted Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/multiverse Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/universe Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/universe Packages
        Ign http://ubuntu.univ-nantes.fr karmic/main Packages
        Ign http://ubuntu.univ-nantes.fr karmic/restricted Packages
        Ign http://ubuntu.univ-nantes.fr karmic/multiverse Packages
        Ign http://ubuntu.univ-nantes.fr karmic/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic/universe Sources
        Ign http://ubuntu.univ-nantes.fr karmic/universe Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/main Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/restricted Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/multiverse Packages
        Ign http://ubuntu.univ-nantes.fr karmic-updates/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/universe Sources
        Ign http://ubuntu.univ-nantes.fr karmic-updates/universe Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/main Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/restricted Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/multiverse Packages
        Ign http://ubuntu.univ-nantes.fr karmic-security/restricted Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/main Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/universe Sources
        Ign http://ubuntu.univ-nantes.fr karmic-security/universe Packages
        Err http://ubuntu.univ-nantes.fr karmic/main Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic/restricted Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic/multiverse Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic/restricted Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic/main Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic/universe Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic/universe Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/main Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/restricted Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/multiverse Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/restricted Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/main Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/universe Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-updates/universe Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/main Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/restricted Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/multiverse Packages  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/restricted Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/main Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/universe Sources  404 Not Found
        Err http://ubuntu.univ-nantes.fr karmic-security/universe Packages  404 Not Found
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/karmic/main/source/Sources.gz  404 Not Found
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/karmic/restricted/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/main/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/restricted/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/multiverse/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/restricted/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/main/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/universe/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic/universe/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/main/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/restricted/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/multiverse/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/restricted/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/main/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/universe/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-updates/universe/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/main/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/restricted/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/multiverse/binary-i386/Packages.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/restricted/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/main/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/universe/source/Sources.gz  404 Not Found
        W: Failed to fetch http://ubuntu.univ-nantes.fr/ubuntu/dists/karmic-security/universe/binary-i386/Packages.gz  404 Not Found

    However, I can 'see' those files with Firefox. apt.conf:

        more /etc/apt/apt.conf
        Acquire::http::proxy "http://www.myproxyname.fr:3128";

    I also tried with port '80', or with a blank /etc/apt/apt.conf. sources.list:

        grep -v "#" /etc/apt/sources.list
        deb http://ubuntu.univ-nantes.fr/ubuntu/ karmic main restricted multiverse
        deb http://ubuntu.univ-nantes.fr/ubuntu/ karmic-updates main restricted multiverse
        deb http://ubuntu.univ-nantes.fr/ubuntu/ karmic universe
        deb http://ubuntu.univ-nantes.fr/ubuntu/ karmic-updates universe
        deb http://ubuntu.univ-nantes.fr/ubuntu/ karmic-security main restricted multiverse
        deb http://ubuntu.univ-nantes.fr/ubuntu/ karmic-security universe

    Does anyone know how to fix this? Thanks, Pierre

    Read the article

  • Problem Disabling Roaming Profiles on Grouped Users

    - by user43207
    I'm having some serious issues getting a group of users to stop using roaming profiles. As expected, I have roaming profiles enabled across the domain, but am doing GPO filtering to limit the scope. I originally had it set to authenticated users for roaming, but as the domain has branched out to multiple locations, I've limited the scope to only people near the central office. The GPO that I have linked is filtered to a group I created that includes the users I don't want to have roaming profiles. This GPO sits at the root of the domain, with the "Enforced" setting enabled, so it should override any setting below it. (On a side note, it is the ONLY GPO that I have set to "Enforced" right now.) I know the GPO is working, since I can see the original registry settings on a user that logged in under roaming profiles; after I made the Group Policy changes, the registry for that same user reflects a local profile. But unfortunately, even after making those settings, the user is given a roaming profile on one of the servers. A gpresult of that same user account (after the updated GPO) is listed in the block below; you can see right at the top of the output that it is in fact dealing with a roaming profile. And sure enough, on the server hosting the file share for roaming profiles, a folder is created for the user once they log in. For testing purposes, I've deleted all copies of the user's profile, roaming and local, but the problem is still here. So I'm apparently missing something in the Group Policy settings on a wider scale. Would anybody be able to point me in the direction of what I'm missing here?

        gpresult /r

        Microsoft (R) Windows (R) Operating System Group Policy Result tool v2.0
        Copyright (C) Microsoft Corp. 1981-2001

        Created On 5/15/2010 at 8:59:00 AM

        RSOP data for ** on * : Logging Mode
        ------------------------------------
        OS Configuration:            Member Workstation
        OS Version:                  6.1.7600
        Site Name:                   N/A
        Roaming Profile:             \\profiles$**
        Local Profile:               C:\Users***
        Connected over a slow link?: No

        USER SETTINGS
        --------------
        CN=*****,OU=*****,OU=*****,OU=*****,DC=*****,DC=*****

        Last time Group Policy was applied: 5/15/2010 at 8:52:02 AM
        Group Policy was applied from:      *****.*****.com
        Group Policy slow link threshold:   500 kbps
        Domain Name:                        USSLINDSTROM
        Domain Type:                        Windows 2000

        Applied Group Policy Objects
        -----------------------------
            ForceLocalProfilesOnly
            InternetExplorer_*****
            GlobalPasswordPolicy

        The following GPOs were not applied because they were filtered out
        -------------------------------------------------------------------
            DAgentFirewallExceptions            Filtering: Denied (Security)
            WSAdmin_*****                       Filtering: Denied (Security)
            NetlogonFirewallExceptions          Filtering: Not Applied (Empty)
            NetLogon_*****                      Filtering: Denied (Security)
            WSUSUpdateScheduleManualInstall     Filtering: Denied (Security)
            WSUSUpdateScheduleDaily_0300        Filtering: Denied (Security)
            WSUSUpdateScheduleThu_0100          Filtering: Denied (Security)
            AlternateSSLFirewallExceptions      Filtering: Denied (Security)
            SNMPFirewallExceptions              Filtering: Denied (Security)
            WSUSUpdateScheduleSun_0100          Filtering: Denied (Security)
            SQLServerFirewallExceptions         Filtering: Denied (Security)
            WSUSUpdateScheduleTue_0100          Filtering: Denied (Security)
            WSUSUpdateScheduleSat_0100          Filtering: Denied (Security)
            DisableUAC                          Filtering: Denied (Security)
            ICMPFirewallExceptions              Filtering: Denied (Security)
            AdminShareFirewallExceptions        Filtering: Denied (Security)
            GPRefreshInterval                   Filtering: Denied (Security)
            ServeRAIDFirewallExceptions         Filtering: Denied (Security)
            WSUSUpdateScheduleFri_0100          Filtering: Denied (Security)
            BlockFirewallExceptions(8400-8410)  Filtering: Denied (Security)
            WSUSUpdateScheduleWed_0100          Filtering: Denied (Security)
            Local Group Policy                  Filtering: Not Applied (Empty)
            WSUS_*****                          Filtering: Denied (Security)
            LogonAsService_Idaho                Filtering: Denied (Security)
            ReportServerFirewallExceptions      Filtering: Denied (Security)
            WSUSUpdateScheduleMon_0100          Filtering: Denied (Security)
            TFSFirewallExceptions               Filtering: Denied (Security)
            Default Domain Policy               Filtering: Not Applied (Empty)
            DenyServerSideRoamingProfiles       Filtering: Denied (Security)
            ShareConnectionsRemainAlive         Filtering: Denied (Security)

        The user is a part of the following security groups
        ---------------------------------------------------
            Domain Users
            Everyone
            BUILTIN\Users
            BUILTIN\Administrators
            NT AUTHORITY\INTERACTIVE
            CONSOLE LOGON
            NT AUTHORITY\Authenticated Users
            This Organization
            LOCAL
            *****Users
            VPNAccess_*****
            NetAdmin_*****
            SiteAdmin_*****
            WSAdmin_*****
            VPNAccess_*****
            LocalProfileOnly_*****
            NetworkAdmin_*****
            LocalProfileOnly_*****
            VPNAccess_*****
            NetAdmin_*****
            Domain Admins
            WSAdmin_*****
            WSAdmin_*****
            *****
            *****
            Schema Admins
            *****
            Enterprise Admins
            Denied RODC Password Replication Group
            High Mandatory Level

    Read the article

  • SQL Server 2005 jobs running twice in a row - using LiteSpeed

    - by Malnizzle
    Howdy! I have a SQL Server (2005) backing up to a network share, with a group of maintenance plans set up through LiteSpeed to back up different DBs. They were set up to run two sub-plans on different schedules for full/diff backups, and did that just fine for a couple of months. Then I added a "Clean Up" task to the sub-plans. Ever since that point, the backup creates another .bak right after the first .bak job completes. I removed the clean-up item from the sub-plan, and it still creates two .baks when run. Both the SQL Activity Monitor and the machine's Windows application log show just one job being executed. I did this same thing to a couple of other servers backing up to the same location, and they behave correctly. Thoughts?

    Read the article

  • How to warehouse data that is no longer needed from MS SQL Server

    - by I__
    I have been asked to truncate a large table in MS SQL Server 2008. The data is not needed now, but might be needed once every two years. It will NEVER have to be changed, only viewed. The question is: since I don't need the data on a day-to-day basis, what do I do with it to protect it and back it up? Please keep in mind that I will need it accessible maybe once every two years, and it is FINE for us if the recovery process takes a few hours. The entire table is about 3 million rows, and I need to truncate it to about 1 million rows.
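    A hedged sketch of the usual pattern: copy the cold rows into a separate archive database, back that up as a single self-contained file, verify, and only then delete from the live table (the names and the cutoff predicate are illustrative):

        -- copy the cold rows out (SELECT ... INTO creates the archive table)
        SELECT *
        INTO ArchiveDB.dbo.BigTable_Archive
        FROM LiveDB.dbo.BigTable
        WHERE CreatedDate < '2010-01-01';

        -- one backup file to keep somewhere safe for the every-two-years case
        BACKUP DATABASE ArchiveDB TO DISK = N'E:\Backups\ArchiveDB.bak';

        -- only after the backup is verified
        DELETE FROM LiveDB.dbo.BigTable
        WHERE CreatedDate < '2010-01-01';

    Recovery is then a plain RESTORE DATABASE of the archive, which fits the stated few-hours tolerance.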

    Read the article

  • PostgreSQL lots of large Arrays and Writes

    - by strife911
    Hi, I am running a python program that spawns 8 threads, each of which opens its own connection (and thus its own backend process) via psycopg2. This is to maximize the use of my CPU cores (8). Each thread calls a series of SQL functions. Most of these functions go through many thousands of rows, each associated with a large FLOAT8[] array (250-300 values), using unnest() and multiplying each FLOAT8 by another FLOAT8 associated with each row. This array approach minimizes the size of the indexes and the tables. Each function ends with an insert into another table of a row of the same form (pk INT4, array FLOAT8[]). Some SQL functions called by python update a row in these kinds of tables (with large arrays). I currently have PostgreSQL configured to use most of the memory for cache (effective_cache_size of 57 GB, I think) and only a small amount for shared memory (1 GB, I think). First, I was wondering what the difference between cache and shared memory is in regards to PostgreSQL (and my application). What I have noticed is that only about 20-40% of my total CPU processing power is used during the most read-intensive parts of the application (select unnest(array) etc.). So secondly, I was wondering what I could do to improve this so that 100% of the CPU is used. Based on my observations, it does not seem to have anything to do with python or its GIL. Thanks
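    On the first question, the two settings play very different roles; a sketch of the distinction in postgresql.conf terms, using the values mentioned above (illustrative):

        # shared_buffers is memory PostgreSQL itself allocates and manages for its
        # page cache (the "shared memory" figure); changing it requires a restart.
        shared_buffers = 1GB

        # effective_cache_size allocates nothing: it is only a planner hint about
        # how much OS filesystem cache is likely available, so a large value just
        # makes index scans look cheaper to the optimizer.
        effective_cache_size = 57GB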

    Read the article

  • Column locking in innodb?

    - by mingyeow
    I know this sounds weird, but apparently one of my columns is locked.

        select * from table where type_id = 1 and updated_at < '2010-03-14' limit 1;
        select * from table where type_id = 3 and updated_at < '2010-03-14' limit 10;

    The first one never finishes running, while the second one completes smoothly. The only difference is the type_id. Thanks in advance for your help; I have an urgent data job to finish, and this problem is driving me crazy.

    Read the article

  • Maintenance Plan Reporting - Append To File - Clean Up?

    - by Adam J.R. Erickson
    Background: (SQL Server 2005, Standard Ed.) I have a maintenance plan running backups, taking a full backup once a day and a t-log backup every 15 minutes. I have it set to create a text file report of each run, but that creates A LOT of files on the file server. These are hard to sort through, which makes them less useful. Question: There is an option in the "Reporting and Logging" settings for appending all logs together, but how do you clean this out? If you're appending to the same log file every time, how do you make sure the file doesn't grow indefinitely? Is there a built-in function to clean out portions of appended logs, like there is for cleaning out individual old log files?

    Read the article

  • How do I restore a database on a remote SQL server 2005 from a local backup?

    - by MatsT
    I have been given access to (parts of) a remote SQL Server 2005 with SQL Server authentication, in order to be able to make changes to a database without involving other people who are not working on the project. The database has been created on my local machine. Is there any way to restore the remote database from a backup file on my local computer? I do not currently have access to the filesystem on the remote server. EDIT: To clarify, the access I have is that I can log in to the server via SQL Server Management Studio. I have one connection to my local database server and one connection to the remote server. What I basically want to do is copy the database from one connection to the other.
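    For context, a sketch of why the filesystem access matters: the path in a RESTORE is read by the remote SQL Server service itself, so a local .bak only works if it sits somewhere the remote instance can reach, such as a UNC share its service account can read (all names below are hypothetical, and the logical file names must match the backup's):

        RESTORE DATABASE MyProjectDb
        FROM DISK = N'\\myworkstation\backups\MyProjectDb.bak'
        WITH MOVE 'MyProjectDb'     TO N'D:\Data\MyProjectDb.mdf',
             MOVE 'MyProjectDb_log' TO N'D:\Data\MyProjectDb_log.ldf';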

    Read the article
