VCX: Create Connection Error (Item Not Found)


#1

After successfully initialising VCX with libindy 1.6.2 and the Docker indy_pool from indy-sdk/ci, I attempt to create a connection, but the result is an "item not found" error.
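The calling code is essentially the demo flow from the Node wrapper (a simplified sketch; I am assuming the node-vcx-wrapper API from the indy-sdk demos, which may differ by version):

import { initVcx, Connection } from 'node-vcx-wrapper';

async function run(): Promise<void> {
  // point VCX at the config below (path is illustrative)
  await initVcx('vcx-config.json');
  // this is the step that ends in "item not found"
  const connection = await Connection.create({ id: 'alice' });
  await connection.connect({ data: '{"connection_type":"QR"}' });
}

run().catch((e) => console.error(e));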

config:
{
  "agency_did": "Th7MpTaRZVRYnPiabds81Y",
  "agency_endpoint": "https://eas01.pps.evernym.com",
  "agency_verkey": "FYmoFw55GeQH7SRFa37dkx1d2dZ3zUF8ckg7wmL7ofN4",
  "genesis_path": "/Users/phillipgibb/repo/vcx-poc/docker_pool_transactions_genesis",
  "institution_did": "RbTrA8BX6gZdEW9LUyFQv7",
  "institution_logo_url": "http://oldmutual-dashboard.cnsnt.io/img/brand/consent.png",
  "institution_name": "APP",
  "institution_verkey": "EQUJdikaXBAXJgELqgwKNofJmcGq5B1rYbg9rp5bZLih",
  "remote_to_sdk_did": "Ptp3Y9A157yG2mrnzsMjCS",
  "remote_to_sdk_verkey": "DUhj8qTm6TtqZ78LqJHF9y8sTPWtXKz8FaFJewNGobG3",
  "sdk_to_remote_did": "8D3UWRyyxmUfrJB5viLKx8",
  "sdk_to_remote_verkey": "4vr9rLpy74L5AAEjfd4pKBfexU1xu4vqfu3GPGABkMD3",
  "exported_wallet_path":"/Users/phillipgibb/.indy_client/wallet",
  "indy_wallet_path":"/Users/phillipgibb/.indy_client/wallet",
  "wallet_key": "12345",
  "wallet_name": "acme",
  "pool_name": "Node1",
  "protocol_version": "2"
}

error:

2018-09-10T14:17:44.200Z [VCX-POC] info: ---------------
2018-09-10T14:17:44.201Z [VCX-POC] info: Intializing VCX
2018-09-10T14:17:44.201Z [VCX-POC] info: ---------------
TRACE|indy::api::payments           |                src/api/payments.rs:353 | indy_register_payment_method: >>> payment_method: 0x102320870
TRACE|indy::api::payments           |                src/api/payments.rs:371 | indy_register_payment_method: entities >>> payment_method: "null"
TRACE|indy::api::payments           |                src/api/payments.rs:401 | indy_register_payment_method: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:135 | PaymentsCommand command received
TRACE|indy::commands::payments      |           src/commands/payments.rs:269 | register_method >>> type_: "null", methods: PaymentsMethod { create_address: 0x10cc6c430, add_request_fees: 0x10cc8bce0, parse_response_with_fees: 0x10cc601e0, build_get_payment_sources_request: 0x10cc8f010, parse_get_payment_sources_response: 0x10cc89590, build_payment_req: 0x10cc990d0, parse_payment_response: 0x10cc7d1e0, build_mint_req: 0x10cc9ba20, build_set_txn_fees_req: 0x10cc79f20, build_get_txn_fees_req: 0x10cc815e0, parse_get_txn_fees_response: 0x10cc9b590, build_verify_payment_req: 0x10cc651b0, parse_verify_payment_response: 0x10cc8da80 }
TRACE|indy::services::payments      |           src/services/payments.rs:81  | register_payment_method >>> method_type: "null"
TRACE|indy::services::payments      |           src/services/payments.rs:83  | register_payment_method <<<
TRACE|indy::commands::payments      |           src/commands/payments.rs:274 | register_method << res: Ok(())
2018-09-10T14:17:44.226Z [VCX-POC] info: done
TRACE|indy::api::pool               |                    src/api/pool.rs:33  | indy_create_pool_ledger_config: >>> config_name: 0x102325ef0, config: 0x102325fe0
TRACE|indy::api::pool               |                    src/api/pool.rs:39  | indy_create_pool_ledger_config: entities >>> config_name: "Node1", config: Some("{\"genesis_txn\":\"/Users/phillipgibb/repo/vcx-poc/docker_pool_transactions_genesis\"}")
TRACE|indy::api::pool               |                    src/api/pool.rs:54  | indy_create_pool_ledger_config: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:115 | PoolCommand command received
DEBUG|indy::commands::pool          |               src/commands/pool.rs:139 | create >>> name: "Node1", config: Some("{\"genesis_txn\":\"/Users/phillipgibb/repo/vcx-poc/docker_pool_transactions_genesis\"}")
TRACE|indy::services::pool          |           src/services/pool/mod.rs:61  | PoolService::create Node1 with config Some("{\"genesis_txn\":\"/Users/phillipgibb/repo/vcx-poc/docker_pool_transactions_genesis\"}")
ERROR|indy::errors::indy            |                 src/errors/indy.rs:73  | Casting error to ErrorCode: Pool ledger config already exists Pool ledger config file with name "Node1" already exists
TRACE|indy::api::pool               |                    src/api/pool.rs:47  | indy_create_pool_ledger_config:
TRACE|indy::api::pool               |                    src/api/pool.rs:291 | indy_set_protocol_version: >>> protocol_version: 2
TRACE|indy::api::pool               |                    src/api/pool.rs:295 | indy_set_protocol_version: entities >>> protocol_version: 2
TRACE|indy::api::pool               |                    src/api/pool.rs:310 | indy_set_protocol_version: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:115 | PoolCommand command received
DEBUG|indy::commands::pool          |               src/commands/pool.rs:228 | set_protocol_version >>> version: 2
DEBUG|indy::commands::pool          |               src/commands/pool.rs:237 | set_protocol_version <<<
TRACE|indy::api::pool               |                    src/api/pool.rs:303 | indy_set_protocol_version:
TRACE|indy::api::pool               |                    src/api/pool.rs:89  | indy_open_pool_ledger: >>> config_name: 0x105602550, config: 0x0
TRACE|indy::api::pool               |                    src/api/pool.rs:95  | indy_open_pool_ledger: entities >>> config_name: "Node1", config: None
TRACE|indy::api::pool               |                    src/api/pool.rs:110 | indy_open_pool_ledger: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:115 | PoolCommand command received
DEBUG|indy::commands::pool          |               src/commands/pool.rs:159 | open >>> name: "Node1", config: None
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:466 | Pool::new name Node1, id 5, config PoolOpenConfig { timeout: 20, extended_timeout: 60, conn_limit: 5, conn_active_timeout: 5, preordered_nodes: [] }
DEBUG|indy::commands::pool          |               src/commands/pool.rs:174 | open <<<
TRACE|indy::services::pool::commander|     src/services/pool/commander.rs:22  | cmd_parts [[99, 111, 110, 110, 101, 99, 116], [5, 0, 0, 0], [255, 255, 255, 255]]
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:537 | received pool event: Some(CheckCache(5))
TRACE|indy::services::pool::merkle_tree_factory|src/services/pool/merkle_tree_factory.rs:27  | Restoring merkle tree from genesis
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:94  | ZMQNetworker::process_event: nodes_updated [RemoteNode { name: "Node3", public_key: [79, 5, 67, 174, 174, 29, 149, 231, 155, 21, 224, 85, 146, 120, 150, 148, 59, 245, 158, 37, 83, 26, 154, 135, 77, 138, 28, 147, 175, 47, 27, 119], zaddr: "tcp://127.0.0.1:9706", is_blacklisted: false }, RemoteNode { name: "Node1", public_key: [245, 162, 146, 125, 78, 184, 226, 60, 221, 1, 103, 194, 231, 134, 97, 57, 147, 89, 13, 171, 80, 230, 214, 139, 195, 130, 29, 243, 184, 195, 79, 31], zaddr: "tcp://127.0.0.1:9702", is_blacklisted: false }, RemoteNode { name: "Node2", public_key: [221, 7, 213, 25, 70, 73, 199, 226, 112, 254, 71, 75, 167, 252, 229, 6, 9, 164, 175, 172, 99, 45, 152, 229, 52, 56, 128, 121, 246, 160, 155, 14], zaddr: "tcp://127.0.0.1:9704", is_blacklisted: false }, RemoteNode { name: "Node4", public_key: [196, 24, 141, 31, 105, 237, 27, 65, 201, 62, 59, 55, 193, 242, 181, 53, 42, 157, 34, 231, 131, 64, 176, 60, 30, 210, 123, 110, 169, 191, 133, 26], zaddr: "tcp://127.0.0.1:9708", is_blacklisted: false }]
TRACE|indy::services::pool::request_handler|src/services/pool/request_handler.rs:223 | start catchup, ne: Some(SendAllRequest("{\"op\":\"LEDGER_STATUS\",\"txnSeqNo\":4,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"ledgerId\":0,\"ppSeqNo\":null,\"viewNo\":null,\"protocolVersion\":2}", "H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG", 60, None))
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:59  | sending new request
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:83  | send request in new conn
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:180 | PoolConnection::new: from nodes [RemoteNode { name: "Node3", public_key: [79, 5, 67, 174, 174, 29, 149, 231, 155, 21, 224, 85, 146, 120, 150, 148, 59, 245, 158, 37, 83, 26, 154, 135, 77, 138, 28, 147, 175, 47, 27, 119], zaddr: "tcp://127.0.0.1:9706", is_blacklisted: false }, RemoteNode { name: "Node1", public_key: [245, 162, 146, 125, 78, 184, 226, 60, 221, 1, 103, 194, 231, 134, 97, 57, 147, 89, 13, 171, 80, 230, 214, 139, 195, 130, 29, 243, 184, 195, 79, 31], zaddr: "tcp://127.0.0.1:9702", is_blacklisted: false }, RemoteNode { name: "Node2", public_key: [221, 7, 213, 25, 70, 73, 199, 226, 112, 254, 71, 75, 167, 252, 229, 6, 9, 164, 175, 172, 99, 45, 152, 229, 52, 56, 128, 121, 246, 160, 155, 14], zaddr: "tcp://127.0.0.1:9704", is_blacklisted: false }, RemoteNode { name: "Node4", public_key: [196, 24, 141, 31, 105, 237, 27, 65, 201, 62, 59, 55, 193, 242, 181, 53, 42, 157, 34, 231, 131, 64, 176, 60, 30, 210, 123, 110, 169, 191, 133, 26], zaddr: "tcp://127.0.0.1:9708", is_blacklisted: false }]
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:255 | send_request >> pe: Some(SendAllRequest("{\"op\":\"LEDGER_STATUS\",\"txnSeqNo\":4,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"ledgerId\":0,\"ppSeqNo\":null,\"viewNo\":null,\"protocolVersion\":2}", "H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG", 60, None))
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:319 | _send_msg_to_one_node >> idx 0, req_id H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG, req {"op":"LEDGER_STATUS","txnSeqNo":4,"merkleRoot":"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG","ledgerId":0,"ppSeqNo":null,"viewNo":null,"protocolVersion":2}
DEBUG|indy::services::pool::networker|     src/services/pool/networker.rs:331 | _get_socket: open new socket for node 0
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:325 | _send_msg_to_one_node <<
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:319 | _send_msg_to_one_node >> idx 1, req_id H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG, req {"op":"LEDGER_STATUS","txnSeqNo":4,"merkleRoot":"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG","ledgerId":0,"ppSeqNo":null,"viewNo":null,"protocolVersion":2}
DEBUG|indy::services::pool::networker|     src/services/pool/networker.rs:331 | _get_socket: open new socket for node 1
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:325 | _send_msg_to_one_node <<
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:319 | _send_msg_to_one_node >> idx 2, req_id H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG, req {"op":"LEDGER_STATUS","txnSeqNo":4,"merkleRoot":"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG","ledgerId":0,"ppSeqNo":null,"viewNo":null,"protocolVersion":2}
DEBUG|indy::services::pool::networker|     src/services/pool/networker.rs:331 | _get_socket: open new socket for node 2
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:325 | _send_msg_to_one_node <<
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:319 | _send_msg_to_one_node >> idx 3, req_id H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG, req {"op":"LEDGER_STATUS","txnSeqNo":4,"merkleRoot":"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG","ledgerId":0,"ppSeqNo":null,"viewNo":null,"protocolVersion":2}
DEBUG|indy::services::pool::networker|     src/services/pool/networker.rs:331 | _get_socket: open new socket for node 3
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:325 | _send_msg_to_one_node <<
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:285 | send_request <<
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:105 | PoolSM: from init to getting catchup target
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:537 | received pool event: Some(NodeReply("{\"viewNo\":null,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"txnSeqNo\":4,\"op\":\"LEDGER_STATUS\",\"ppSeqNo\":null,\"protocolVersion\":2,\"ledgerId\":0}", "Node1"))
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:248 | is_active >> time worked: Duration { secs: 0, nanos: 48189000 }
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:250 | is_active << true
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:537 | received pool event: Some(NodeReply("{\"ppSeqNo\":null,\"protocolVersion\":2,\"ledgerId\":0,\"viewNo\":null,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"txnSeqNo\":4,\"op\":\"LEDGER_STATUS\"}", "Node3"))
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:248 | is_active >> time worked: Duration { secs: 0, nanos: 50380000 }
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:250 | is_active << true
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:537 | received pool event: Some(NodeReply("{\"txnSeqNo\":4,\"viewNo\":null,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"ledgerId\":0,\"protocolVersion\":2,\"ppSeqNo\":null,\"op\":\"LEDGER_STATUS\"}", "Node2"))
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:248 | is_active >> time worked: Duration { secs: 0, nanos: 57317000 }
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:250 | is_active << true
TRACE|indy::services::pool::networker|     src/services/pool/networker.rs:94  | ZMQNetworker::process_event: nodes_updated [RemoteNode { name: "Node4", public_key: [196, 24, 141, 31, 105, 237, 27, 65, 201, 62, 59, 55, 193, 242, 181, 53, 42, 157, 34, 231, 131, 64, 176, 60, 30, 210, 123, 110, 169, 191, 133, 26], zaddr: "tcp://127.0.0.1:9708", is_blacklisted: false }, RemoteNode { name: "Node1", public_key: [245, 162, 146, 125, 78, 184, 226, 60, 221, 1, 103, 194, 231, 134, 97, 57, 147, 89, 13, 171, 80, 230, 214, 139, 195, 130, 29, 243, 184, 195, 79, 31], zaddr: "tcp://127.0.0.1:9702", is_blacklisted: false }, RemoteNode { name: "Node2", public_key: [221, 7, 213, 25, 70, 73, 199, 226, 112, 254, 71, 75, 167, 252, 229, 6, 9, 164, 175, 172, 99, 45, 152, 229, 52, 56, 128, 121, 246, 160, 155, 14], zaddr: "tcp://127.0.0.1:9704", is_blacklisted: false }, RemoteNode { name: "Node3", public_key: [79, 5, 67, 174, 174, 29, 149, 231, 155, 21, 224, 85, 146, 120, 150, 148, 59, 245, 158, 37, 83, 26, 154, 135, 77, 138, 28, 147, 175, 47, 27, 119], zaddr: "tcp://127.0.0.1:9706", is_blacklisted: false }]
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:684 | PoolSM: from getting catchup target to active
 INFO|indy::commands                |                src/commands/mod.rs:115 | PoolCommand command received
 INFO|indy::commands::pool          |               src/commands/pool.rs:69  | OpenAck handle 5, pool_id 5, result Ok(())
TRACE|indy::api::pool               |                    src/api/pool.rs:103 | indy_open_pool_ledger: pool_handle: 5
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:537 | received pool event: Some(NodeReply("{\"protocolVersion\":2,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"ppSeqNo\":null,\"ledgerId\":0,\"viewNo\":null,\"op\":\"LEDGER_STATUS\",\"txnSeqNo\":4}", "Node4"))
TRACE|indy::services::pool::pool    |          src/services/pool/pool.rs:384 | received reply from node "Node4": "{\"protocolVersion\":2,\"merkleRoot\":\"H6Xnsc74UifcrwrRT5n6VFiBzz8yqYkq8UDM8Ey9c7mG\",\"ppSeqNo\":null,\"ledgerId\":0,\"viewNo\":null,\"op\":\"LEDGER_STATUS\",\"txnSeqNo\":4}"
TRACE|indy::api::wallet             |                  src/api/wallet.rs:262 | indy_open_wallet: >>> command_handle: 4, config: 0x1051047e0, credentials: 0x105104840, cb: Some(0x109e14b50)
TRACE|indy::api::wallet             |                  src/api/wallet.rs:269 | indy_open_wallet: params config: "{\"id\":\"acme\"}", credentials: "_"
TRACE|indy::api::wallet             |                  src/api/wallet.rs:285 | indy_open_wallet: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:123 | WalletCommand command received
TRACE|indy::commands::wallet        |             src/commands/wallet.rs:170 | _open >>> config: "{\"id\":\"acme\"}", credentials: "_"
TRACE|indy::services::wallet        |         src/services/wallet/mod.rs:220 | open_wallet >>> config: "{\"id\":\"acme\"}", credentials: "_"
TRACE|indy::services::wallet        |         src/services/wallet/mod.rs:296 | open_wallet <<< res: 7
TRACE|indy::commands::wallet        |             src/commands/wallet.rs:174 | _open <<< res: 7
TRACE|indy::api::wallet             |                  src/api/wallet.rs:278 | indy_open_wallet: cb command_handle: 4 err: Success, handle: 7
2018-09-10T14:17:45.388Z [VCX-POC] info: VCX Result undefined
TRACE|indy::api::did                |                     src/api/did.rs:55  | indy_create_and_store_my_did: >>> wallet_handle: 7, did_json: 0x105104100
TRACE|indy::api::did                |                     src/api/did.rs:60  | indy_create_and_store_my_did: entities >>> wallet_handle: 7, did_json: "_"
TRACE|indy::api::did                |                     src/api/did.rs:77  | indy_create_and_store_my_did: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:119 | DidCommand command received
 INFO|indy::commands::did           |                src/commands/did.rs:131 | CreateAndStoreMyDid command received
DEBUG|indy::commands::did           |                src/commands/did.rs:196 | create_and_store_my_did >>> wallet_handle: 7, my_did_info_json: "_"
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:81  | create_my_did >>> my_did_info: MyDidInfo { did: None, seed: None, crypto_type: None, cid: None }
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:349 | convert_seed >>> seed: None
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:364 | convert_seed <<< res: None
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:118 | create_my_did <<< did: (Did { did: "HMw7NjHMRHD8tq63RazQEt", verkey: "9vCeA2LQWT4Req6j6B2BpJDCRxwzEjdHNHqFb4fjD6kE" }, Key { verkey: "9vCeA2LQWT4Req6j6B2BpJDCRxwzEjdHNHqFb4fjD6kE" })
DEBUG|indy::commands::did           |                src/commands/did.rs:215 | create_and_store_my_did <<< res: ("HMw7NjHMRHD8tq63RazQEt", "9vCeA2LQWT4Req6j6B2BpJDCRxwzEjdHNHqFb4fjD6kE")
TRACE|indy::api::did                |                     src/api/did.rs:68  | indy_create_and_store_my_did: did: "HMw7NjHMRHD8tq63RazQEt", verkey: "9vCeA2LQWT4Req6j6B2BpJDCRxwzEjdHNHqFb4fjD6kE"
TRACE|indy::api::crypto             |                  src/api/crypto.rs:335 | indy_crypto_auth_crypt: >>> wallet_handle: 7, sender_vk: 0x105026f20, recipient_vk: 0x105028b30, msg_data: 0x1050289f0, msg_len: 145
TRACE|indy::api::crypto             |                  src/api/crypto.rs:343 | indy_crypto_auth_crypt: entities >>> wallet_handle: 7, sender_vk: "4vr9rLpy74L5AAEjfd4pKBfexU1xu4vqfu3GPGABkMD3", recipient_vk: "DUhj8qTm6TtqZ78LqJHF9y8sTPWtXKz8FaFJewNGobG3", msg_data: [129, 167, 98, 117, 110, 100, 108, 101, 100, 145, 220, 0, 121, 204, 131, 204, 165, 64, 116, 121, 112, 101, 204, 130, 204, 164, 110, 97, 109, 101, 204, 170, 67, 82, 69, 65, 84, 69, 95, 75, 69, 89, 204, 163, 118, 101, 114, 204, 163, 49, 46, 48, 204, 166, 102, 111, 114, 68, 73, 68, 204, 182, 72, 77, 119, 55, 78, 106, 72, 77, 82, 72, 68, 56, 116, 113, 54, 51, 82, 97, 122, 81, 69, 116, 204, 172, 102, 111, 114, 68, 73, 68, 86, 101, 114, 75, 101, 121, 204, 217, 44, 57, 118, 67, 101, 65, 50, 76, 81, 87, 84, 52, 82, 101, 113, 54, 106, 54, 66, 50, 66, 112, 74, 68, 67, 82, 120, 119, 122, 69, 106, 100, 72, 78, 72, 113, 70, 98, 52, 102, 106, 68, 54, 107, 69], msg_len: 145
TRACE|indy::api::crypto             |                  src/api/crypto.rs:362 | indy_crypto_auth_crypt: <<< res: Success
 INFO|indy::commands                |                src/commands/mod.rs:107 | CryptoCommand command received
 INFO|indy::commands::crypto        |             src/commands/crypto.rs:106 | AuthenticatedEncrypt command received
DEBUG|indy::commands::crypto        |             src/commands/crypto.rs:178 | authenticated_encrypt >>> wallet_handle: 7, my_vk: "4vr9rLpy74L5AAEjfd4pKBfexU1xu4vqfu3GPGABkMD3", their_vk: "DUhj8qTm6TtqZ78LqJHF9y8sTPWtXKz8FaFJewNGobG3", msg: [129, 167, 98, 117, 110, 100, 108, 101, 100, 145, 220, 0, 121, 204, 131, 204, 165, 64, 116, 121, 112, 101, 204, 130, 204, 164, 110, 97, 109, 101, 204, 170, 67, 82, 69, 65, 84, 69, 95, 75, 69, 89, 204, 163, 118, 101, 114, 204, 163, 49, 46, 48, 204, 166, 102, 111, 114, 68, 73, 68, 204, 182, 72, 77, 119, 55, 78, 106, 72, 77, 82, 72, 68, 56, 116, 113, 54, 51, 82, 97, 122, 81, 69, 116, 204, 172, 102, 111, 114, 68, 73, 68, 86, 101, 114, 75, 101, 121, 204, 217, 44, 57, 118, 67, 101, 65, 50, 76, 81, 87, 84, 52, 82, 101, 113, 54, 106, 54, 66, 50, 66, 112, 74, 68, 67, 82, 120, 119, 122, 69, 106, 100, 72, 78, 72, 113, 70, 98, 52, 102, 106, 68, 54, 107, 69]
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:370 | validate_key >>> vk: "4vr9rLpy74L5AAEjfd4pKBfexU1xu4vqfu3GPGABkMD3"
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:392 | validate_key <<<
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:370 | validate_key >>> vk: "DUhj8qTm6TtqZ78LqJHF9y8sTPWtXKz8FaFJewNGobG3"
TRACE|indy::services::crypto        |         src/services/crypto/mod.rs:392 | validate_key <<<
ERROR|indy::errors::indy            |                 src/errors/indy.rs:73  | Casting error to ErrorCode: Item not found
TRACE|indy::api::crypto             |                  src/api/crypto.rs:354 | indy_crypto_auth_crypt: encrypted_msg: []
2018-09-10T14:17:45.399Z [VCX-POC] error: Invalid Connection Handle

#2

Hi Phillip,

For a connection from VerityUI to the enterprise agency (https://eas01.pps.evernym.com), the pool you should be connecting to is the STN (Sovrin Test Network). The genesis file is on GitHub, but that seems to be down this morning. If you ping me in the Sovrin Slack, I'll put it in there.
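In your config only the pool-related fields should need to change, along these lines (the local path below is hypothetical; point it at wherever you save the STN genesis file, and use a fresh pool_name so libindy does not reuse the locally cached "Node1" pool config, which your log shows already exists):

  "genesis_path": "/Users/phillipgibb/repo/vcx-poc/stn_pool_transactions_genesis",
  "pool_name": "sovrin_stn"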

Mike Bailey


#3

Alas, I do not have an account on the Sovrin Slack.

So what is https://eas01.pps.evernym.com? A central agent registry that my agent must have a relationship with before I can use Sovrin with VCX? Surely that locks me into Evernym if I use VCX?

Is there not a mock agency I can run that points to my local Docker Indy pool for testing? Otherwise I need to ledger a DID and verkey onto the STN for testing, and I have no idea what that agency is doing.


#4

An agency (like eas01) is basically a traffic-forwarding system that allows two agents with non-fixed endpoint addresses to communicate. It also has the advantage of being privacy-preserving, since entities are unable to cross-correlate a user by endpoint address. As the address indicates, this one is indeed provided by Evernym.
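Conceptually, the forwarding role looks something like this (an illustrative TypeScript sketch only, not the actual agency protocol or API):

type Did = string;

interface Route {
  // e.g. push over a socket, or park in a queue the agent polls
  deliver(encryptedMsg: Uint8Array): Promise<void>;
}

class Agency {
  private routes = new Map<Did, Route>();

  // At provisioning time an agent tells the agency which DID it
  // answers to and how it can currently be reached.
  register(did: Did, route: Route): void {
    this.routes.set(did, route);
  }

  // Senders only ever see the agency's endpoint; the agency relays
  // the still-encrypted payload and cannot read it.
  async forward(to: Did, encryptedMsg: Uint8Array): Promise<void> {
    const route = this.routes.get(to);
    if (!route) throw new Error(`no route registered for ${to}`);
    await route.deliver(encryptedMsg);
  }
}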

I know of no agencies that are currently capable of linking to an arbitrary pool.


#5

So that means that, until Evernym provides one, you cannot use a local node pool for testing, because the agency is not pointing to it; it only points to the test network.

But if my agents have fixed endpoint addresses, why do I need the agency?

Also:

  1. As a user, I might offer my endpoint for an agent to connect to me as part of a pairwise relationship between us. But that is between us and no one else. That endpoint might be my cloud agent endpoint.
  2. The same goes for the agent.

At what point is a user’s agent correlatable?

If a verifiable claim/credential contains an agent's DID to prove that the agent issued it, then you could find the agent's endpoint, but that is the point. No one is going to find the user's DID or endpoint in the claim.

Maybe if the agent stores the user's DID in its database and gets hacked?


#6

It may often make sense for a commercial entity to forgo an agency for its agent endpoint. Such an entity is likely to have a fixed IP address, and is likely to want its DID to be known and tied to its public identity.

For an individual, a fixed-address endpoint is less common. In addition, privacy is important, and it is increasingly becoming mandatory under GDPR and similar initiatives. One way privacy can be compromised with a public ledger is if entities can cross-correlate data to determine that the individual who does 'A' is the same individual who does 'B'. If a fixed IP address is always used for a given individual, such cross-correlation is possible.

VCX was designed with security in mind from the start, so agencies are built in. In the future the community may produce other implementations that relax these standards, but, particularly in the case of individuals, we sincerely hope not.


#7

Another point: no PII or other sensitive information is stored on the agency.


#8

OK, so the act of provisioning keys with VCX is how the agency learns what the agent endpoint is. Then, when connections are accepted and so on, traffic is sent to the agency endpoint, thereby hiding the agent endpoint.
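As I understand it, that provisioning is a one-time call in the Node wrapper, roughly (a sketch assuming the provisionAgent call from the indy-sdk demos; field names may vary by version):

import { provisionAgent } from 'node-vcx-wrapper';

async function provision(): Promise<void> {
  // one-time registration with the agency; returns the completed
  // config (agency keys, sdk_to_remote_* DIDs, ...) that initVcx
  // consumes on every later run
  const vcxConfig = await provisionAgent(JSON.stringify({
    agency_url: 'https://eas01.pps.evernym.com',
    agency_did: 'Th7MpTaRZVRYnPiabds81Y',
    agency_verkey: 'FYmoFw55GeQH7SRFa37dkx1d2dZ3zUF8ckg7wmL7ofN4',
    wallet_name: 'acme',
    wallet_key: '12345',
  }));
  console.log(vcxConfig); // persist this for initVcx
}

provision().catch((e) => console.error(e));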

The agency then forwards the messages to the agent using the DIDs and verkeys shared between them.

So the same would be true for the user's cloud agent: the endpoint would be the agency endpoint?

The agency would ledger the required DIDs, verkeys, and endpoints onto the STN, but these would not correlate to any user or agent, because no one besides the agent or user knows the pairwise relations.


#9

You're welcome to join the Sovrin Slack team and take part in the discussion.


#10

Thanks Phil, I shall join and take part.