2023-05-21 04:43:08,221 44k INFO {'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 3, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 768, 'ssl_dim': 768, 'n_speakers': 1, 'speech_encoder': 'vec768l12', 'speaker_embedding': False}, 'spk': {'baobai': 0}, 'model_dir': './logs/44k'}
2023-05-21 04:43:16,737 44k INFO emb_g.weight is not in the checkpoint
2023-05-21 04:43:16,823 44k INFO Loaded checkpoint './logs/44k/G_0.pth' (iteration 0)
2023-05-21 04:43:18,005 44k INFO Loaded checkpoint './logs/44k/D_0.pth' (iteration 0)
2023-05-21 04:44:18,923 44k INFO ====> Epoch: 1, cost 70.71 s
2023-05-21 04:44:52,034 44k INFO ====> Epoch: 2, cost 33.11 s
2023-05-21 04:45:25,029 44k INFO ====> Epoch: 3, cost 33.00 s
2023-05-21 04:45:58,832 44k INFO ====> Epoch: 4, cost 33.80 s
2023-05-21 04:46:32,881 44k INFO ====> Epoch: 5, cost 34.05 s
2023-05-21 04:47:07,159 44k INFO ====> Epoch: 6, cost 34.28 s
2023-05-21 04:47:41,279 44k INFO ====> Epoch: 7, cost 34.12 s
2023-05-21 04:48:15,771 44k INFO ====> Epoch: 8, cost 34.49 s
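The config above sets `learning_rate: 0.0001` with `lr_decay: 0.999875` applied once per epoch. Assuming a plain exponential schedule (lr at the start of 1-indexed epoch N equals `learning_rate * lr_decay ** (N - 1)`, which is what PyTorch's `ExponentialLR` produces), the lr values printed in the "Train Epoch" lines below can be reproduced exactly; the function name here is illustrative, not part of the trainer:

```python
def lr_at_epoch(epoch: int, base_lr: float = 1e-4, decay: float = 0.999875) -> float:
    """Learning rate at the start of the given 1-indexed epoch,
    assuming per-epoch exponential decay as configured above."""
    return base_lr * decay ** (epoch - 1)

# Compare against the values the trainer logs:
print(lr_at_epoch(10))  # logged at "Train Epoch: 10": 9.98875562335968e-05
print(lr_at_epoch(20))  # logged at "Train Epoch: 20": 9.976276699833672e-05
```

This confirms the decay is keyed to the epoch counter, not the global step.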
2023-05-21 04:48:51,407 44k INFO ====> Epoch: 9, cost 35.64 s
2023-05-21 04:49:13,163 44k INFO Train Epoch: 10 [48%]
2023-05-21 04:49:13,170 44k INFO Losses: [2.5445778369903564, 2.83388090133667, 15.380541801452637, 18.88252830505371, 1.092060923576355], step: 200, lr: 9.98875562335968e-05, reference_loss: 40.73358917236328
2023-05-21 04:49:26,768 44k INFO ====> Epoch: 10, cost 35.36 s
2023-05-21 04:50:00,043 44k INFO ====> Epoch: 11, cost 33.27 s
2023-05-21 04:50:32,887 44k INFO ====> Epoch: 12, cost 32.84 s
2023-05-21 04:51:05,646 44k INFO ====> Epoch: 13, cost 32.76 s
2023-05-21 04:51:38,556 44k INFO ====> Epoch: 14, cost 32.91 s
2023-05-21 04:52:11,877 44k INFO ====> Epoch: 15, cost 33.32 s
2023-05-21 04:52:45,053 44k INFO ====> Epoch: 16, cost 33.18 s
2023-05-21 04:53:18,379 44k INFO ====> Epoch: 17, cost 33.33 s
2023-05-21 04:53:51,413 44k INFO ====> Epoch: 18, cost 33.03 s
2023-05-21 04:54:24,092 44k INFO ====> Epoch: 19, cost 32.68 s
2023-05-21 04:54:34,638 44k INFO Train Epoch: 20 [0%]
2023-05-21 04:54:34,639 44k INFO Losses: [2.2672250270843506, 3.07883882522583, 15.199647903442383, 18.87508201599121, 1.056928277015686], step: 400, lr: 9.976276699833672e-05, reference_loss: 40.477718353271484
2023-05-21 04:54:58,369 44k INFO ====> Epoch: 20, cost 34.28 s
2023-05-21 04:55:31,807 44k INFO ====> Epoch: 21, cost 33.44 s
2023-05-21 04:56:04,832 44k INFO ====> Epoch: 22, cost 33.03 s
2023-05-21 04:56:37,683 44k INFO ====> Epoch: 23, cost 32.85 s
2023-05-21 04:57:11,401 44k INFO ====> Epoch: 24, cost 33.72 s
2023-05-21 04:57:45,362 44k INFO ====> Epoch: 25, cost 33.96 s
2023-05-21 04:58:19,556 44k INFO ====> Epoch: 26, cost 34.19 s
2023-05-21 04:58:53,777 44k INFO ====> Epoch: 27, cost 34.22 s
2023-05-21 04:59:28,417 44k INFO ====> Epoch: 28, cost 34.64 s
2023-05-21 04:59:52,227 44k INFO Train Epoch: 29 [52%]
2023-05-21 04:59:52,228 44k INFO Losses: [2.6102354526519775, 2.5509371757507324, 15.050143241882324, 18.5263729095459, 0.6427398324012756], step: 600, lr: 9.965058998565574e-05, reference_loss: 39.38042449951172
2023-05-21 05:00:04,491 44k INFO ====> Epoch: 29, cost 36.07 s
2023-05-21 05:00:39,358 44k INFO ====> Epoch: 30, cost 34.87 s
2023-05-21 05:01:14,944 44k INFO ====> Epoch: 31, cost 35.59 s
2023-05-21 05:01:50,616 44k INFO ====> Epoch: 32, cost 35.67 s
2023-05-21 05:02:24,588 44k INFO ====> Epoch: 33, cost 33.97 s
2023-05-21 05:02:58,697 44k INFO ====> Epoch: 34, cost 34.11 s
2023-05-21 05:03:31,929 44k INFO ====> Epoch: 35, cost 33.23 s
2023-05-21 05:04:04,532 44k INFO ====> Epoch: 36, cost 32.60 s
2023-05-21 05:04:37,191 44k INFO ====> Epoch: 37, cost 32.66 s
2023-05-21 05:05:09,501 44k INFO ====> Epoch: 38, cost 32.31 s
2023-05-21 05:05:20,677 44k INFO Train Epoch: 39 [5%]
2023-05-21 05:05:20,678 44k INFO Losses: [2.1589901447296143, 2.6669301986694336, 16.74024772644043, 18.339496612548828, 0.8423145413398743], step: 800, lr: 9.952609679164422e-05, reference_loss: 40.74797821044922
2023-05-21 05:05:38,923 44k INFO Saving model and optimizer state at iteration 39 to ./logs/44k/G_800.pth
2023-05-21 05:05:43,429 44k INFO Saving model and optimizer state at iteration 39 to ./logs/44k/D_800.pth
2023-05-21 05:06:08,013 44k INFO ====> Epoch: 39, cost 58.51 s
2023-05-21 05:06:43,198 44k INFO ====> Epoch: 40, cost 35.18 s
2023-05-21 05:07:18,870 44k INFO ====> Epoch: 41, cost 35.67 s
2023-05-21 05:07:53,211 44k INFO ====> Epoch: 42, cost 34.34 s
2023-05-21 05:08:26,741 44k INFO ====> Epoch: 43, cost 33.53 s
2023-05-21 05:09:00,756 44k INFO ====> Epoch: 44, cost 34.01 s
2023-05-21 05:09:34,017 44k INFO ====> Epoch: 45, cost 33.26 s
2023-05-21 05:10:06,712 44k INFO ====> Epoch: 46, cost 32.69 s
2023-05-21 05:10:39,216 44k INFO ====> Epoch: 47, cost 32.50 s
2023-05-21 05:11:01,524 44k INFO Train Epoch: 48 [57%]
2023-05-21 05:11:01,528 44k INFO Losses: [2.5245370864868164, 2.759749174118042, 15.53822135925293, 18.24738121032715, 0.8092263340950012], step: 1000, lr: 9.941418589985758e-05, reference_loss: 39.87911605834961
2023-05-21 05:11:12,263 44k INFO ====> Epoch: 48, cost 33.05 s
2023-05-21 05:11:44,742 44k INFO ====> Epoch: 49, cost 32.48 s
2023-05-21 05:12:16,612 44k INFO ====> Epoch: 50, cost 31.87 s
2023-05-21 05:12:48,642 44k INFO ====> Epoch: 51, cost 32.03 s
2023-05-21 05:13:20,754 44k INFO ====> Epoch: 52, cost 32.11 s
2023-05-21 05:13:53,229 44k INFO ====> Epoch: 53, cost 32.47 s
2023-05-21 05:14:25,437 44k INFO ====> Epoch: 54, cost 32.21 s
2023-05-21 05:14:58,124 44k INFO ====> Epoch: 55, cost 32.69 s
2023-05-21 05:15:30,617 44k INFO ====> Epoch: 56, cost 32.49 s
2023-05-21 05:16:03,136 44k INFO ====> Epoch: 57, cost 32.52 s
2023-05-21 05:16:15,483 44k INFO Train Epoch: 58 [10%]
2023-05-21 05:16:15,484 44k INFO Losses: [2.4595236778259277, 2.8447391986846924, 16.389728546142578, 18.046951293945312, 0.5579648613929749], step: 1200, lr: 9.928998804478705e-05, reference_loss: 40.29890823364258
2023-05-21 05:16:36,572 44k INFO ====> Epoch: 58, cost 33.44 s
2023-05-21 05:17:09,212 44k INFO ====> Epoch: 59, cost 32.64 s
2023-05-21 05:17:41,885 44k INFO ====> Epoch: 60, cost 32.67 s
2023-05-21 05:18:15,056 44k INFO ====> Epoch: 61, cost 33.17 s
2023-05-21 05:18:48,935 44k INFO ====> Epoch: 62, cost 33.88 s
2023-05-21 05:19:23,333 44k INFO ====> Epoch: 63, cost 34.40 s
2023-05-21 05:19:57,813 44k INFO ====> Epoch: 64, cost 34.48 s
2023-05-21 05:20:32,971 44k INFO ====> Epoch: 65, cost 35.16 s
2023-05-21 05:21:08,468 44k INFO ====> Epoch: 66, cost 35.50 s
2023-05-21 05:21:33,538 44k INFO Train Epoch: 67 [62%]
2023-05-21 05:21:33,543 44k INFO Losses: [2.5141336917877197, 2.321308135986328, 13.527164459228516, 17.738521575927734, 0.9825926423072815], step: 1400, lr: 9.917834264256819e-05, reference_loss: 37.08372116088867
2023-05-21 05:21:44,291 44k INFO ====> Epoch: 67, cost 35.82 s
2023-05-21 05:22:18,049 44k INFO ====> Epoch: 68, cost 33.76 s
2023-05-21 05:22:51,054 44k INFO ====> Epoch: 69, cost 33.01 s
2023-05-21 05:23:23,243 44k INFO ====> Epoch: 70, cost 32.19 s
2023-05-21 05:23:55,543 44k INFO ====> Epoch: 71, cost 32.30 s
2023-05-21 05:24:27,802 44k INFO ====> Epoch: 72, cost 32.26 s
2023-05-21 05:24:59,750 44k INFO ====> Epoch: 73, cost 31.95 s
2023-05-21 05:25:31,765 44k INFO ====> Epoch: 74, cost 32.01 s
2023-05-21 05:26:03,995 44k INFO ====> Epoch: 75, cost 32.23 s
2023-05-21 05:26:36,249 44k INFO ====> Epoch: 76, cost 32.25 s
2023-05-21 05:26:49,329 44k INFO Train Epoch: 77 [14%]
2023-05-21 05:26:49,330 44k INFO Losses: [2.3342432975769043, 2.160710573196411, 12.683871269226074, 16.918319702148438, 0.9933038353919983], step: 1600, lr: 9.905443942579728e-05, reference_loss: 35.090450286865234
2023-05-21 05:26:59,032 44k INFO Saving model and optimizer state at iteration 77 to ./logs/44k/G_1600.pth
2023-05-21 05:27:01,667 44k INFO Saving model and optimizer state at iteration 77 to ./logs/44k/D_1600.pth
2023-05-21 05:27:25,363 44k INFO ====> Epoch: 77, cost 49.11 s
2023-05-21 05:27:57,887 44k INFO ====> Epoch: 78, cost 32.52 s
2023-05-21 05:28:29,962 44k INFO ====> Epoch: 79, cost 32.08 s
2023-05-21 05:29:02,076 44k INFO ====> Epoch: 80, cost 32.11 s
2023-05-21 05:29:34,715 44k INFO ====> Epoch: 81, cost 32.64 s
2023-05-21 05:30:07,051 44k INFO ====> Epoch: 82, cost 32.34 s
2023-05-21 05:30:39,190 44k INFO ====> Epoch: 83, cost 32.14 s
2023-05-21 05:31:11,160 44k INFO ====> Epoch: 84, cost 31.97 s
2023-05-21 05:31:43,221 44k INFO ====> Epoch: 85, cost 32.06 s
2023-05-21 05:32:07,280 44k INFO Train Epoch: 86 [67%]
2023-05-21 05:32:07,286 44k INFO Losses: [2.2493386268615723, 2.731492042541504, 13.587553024291992, 15.12157154083252, 0.6404907703399658], step: 1800, lr: 9.894305888331732e-05, reference_loss: 34.330448150634766
2023-05-21 05:32:15,872 44k INFO ====> Epoch: 86, cost 32.65 s
2023-05-21 05:32:48,137 44k INFO ====> Epoch: 87, cost 32.27 s
2023-05-21 05:33:20,556 44k INFO ====> Epoch: 88, cost 32.42 s
2023-05-21 05:33:52,810 44k INFO ====> Epoch: 89, cost 32.25 s
2023-05-21 05:34:25,058 44k INFO ====> Epoch: 90, cost 32.25 s
2023-05-21 05:34:57,181 44k INFO ====> Epoch: 91, cost 32.12 s
2023-05-21 05:35:29,385 44k INFO ====> Epoch: 92, cost 32.20 s
2023-05-21 05:36:01,609 44k INFO ====> Epoch: 93, cost 32.22 s
2023-05-21 05:36:34,026 44k INFO ====> Epoch: 94, cost 32.42 s
2023-05-21 05:37:06,407 44k INFO ====> Epoch: 95, cost 32.38 s
2023-05-21 05:37:20,710 44k INFO Train Epoch: 96 [19%]
2023-05-21 05:37:20,711 44k INFO Losses: [2.2139039039611816, 2.295823097229004, 9.69484806060791, 17.4899845123291, 0.738105058670044], step: 2000, lr: 9.881944960586671e-05, reference_loss: 32.43266296386719
2023-05-21 05:37:39,559 44k INFO ====> Epoch: 96, cost 33.15 s
2023-05-21 05:38:12,464 44k INFO ====> Epoch: 97, cost 32.91 s
2023-05-21 05:38:44,917 44k INFO ====> Epoch: 98, cost 32.45 s
2023-05-21 05:39:17,468 44k INFO ====> Epoch: 99, cost 32.55 s
2023-05-21 05:39:50,088 44k INFO ====> Epoch: 100, cost 32.62 s
2023-05-21 05:40:22,585 44k INFO ====> Epoch: 101, cost 32.50 s
2023-05-21 05:40:55,431 44k INFO ====> Epoch: 102, cost 32.85 s
2023-05-21 05:41:28,821 44k INFO ====> Epoch: 103, cost 33.39 s
2023-05-21 05:42:02,738 44k INFO ====> Epoch: 104, cost 33.92 s
2023-05-21 05:42:30,808 44k INFO Train Epoch: 105 [71%]
2023-05-21 05:42:30,809 44k INFO Losses: [2.2192392349243164, 2.7417397499084473, 13.63097858428955, 18.681087493896484, 0.5394753217697144], step: 2200, lr: 9.870833329479095e-05, reference_loss: 37.81251907348633
2023-05-21 05:42:38,700 44k INFO ====> Epoch: 105, cost 35.96 s
2023-05-21 05:43:13,054 44k INFO ====> Epoch: 106, cost 34.35 s
2023-05-21 05:43:47,619 44k INFO ====> Epoch: 107, cost 34.56 s
2023-05-21 05:44:24,249 44k INFO ====> Epoch: 108, cost 36.63 s
2023-05-21 05:44:59,453 44k INFO ====> Epoch: 109, cost 35.20 s
2023-05-21 05:45:32,699 44k INFO ====> Epoch: 110, cost 33.25 s
2023-05-21 05:46:05,362 44k INFO ====> Epoch: 111, cost 32.66 s
2023-05-21 05:46:37,904 44k INFO ====> Epoch: 112, cost 32.54 s
2023-05-21 05:47:10,343 44k INFO ====> Epoch: 113, cost 32.44 s
2023-05-21 05:47:43,227 44k INFO ====> Epoch: 114, cost 32.88 s
2023-05-21 05:47:59,052 44k INFO Train Epoch: 115 [24%]
2023-05-21 05:47:59,057 44k INFO Losses: [2.2330069541931152, 2.7372024059295654, 8.495346069335938, 17.723203659057617, 0.7708501219749451], step: 2400, lr: 9.858501725933955e-05, reference_loss: 31.95960807800293
2023-05-21 05:48:08,136 44k INFO Saving model and optimizer state at iteration 115 to ./logs/44k/G_2400.pth
2023-05-21 05:48:12,093 44k INFO Saving model and optimizer state at iteration 115 to ./logs/44k/D_2400.pth
2023-05-21 05:48:33,640 44k INFO ====> Epoch: 115, cost 50.41 s
2023-05-21 05:49:06,586 44k INFO ====> Epoch: 116, cost 32.95 s
2023-05-21 05:49:39,074 44k INFO ====> Epoch: 117, cost 32.49 s
2023-05-21 05:50:12,016 44k INFO ====> Epoch: 118, cost 32.94 s
2023-05-21 05:50:44,838 44k INFO ====> Epoch: 119, cost 32.82 s
2023-05-21 05:51:17,762 44k INFO ====> Epoch: 120, cost 32.92 s
2023-05-21 05:51:50,661 44k INFO ====> Epoch: 121, cost 32.90 s
2023-05-21 05:52:23,470 44k INFO ====> Epoch: 122, cost 32.81 s
2023-05-21 05:52:56,124 44k INFO ====> Epoch: 123, cost 32.65 s
2023-05-21 05:53:23,266 44k INFO Train Epoch: 124 [76%]
2023-05-21 05:53:23,267 44k INFO Losses: [2.200976848602295, 2.4772040843963623, 16.151382446289062, 17.823810577392578, 0.8057739734649658], step: 2600, lr: 9.847416455282387e-05, reference_loss: 39.45914840698242
2023-05-21 05:53:30,688 44k INFO ====> Epoch: 124, cost 34.56 s
2023-05-21 05:54:04,912 44k INFO ====> Epoch: 125, cost 34.22 s
2023-05-21 05:54:39,924 44k INFO ====> Epoch: 126, cost 35.01 s
2023-05-21 05:55:15,121 44k INFO ====> Epoch: 127, cost 35.20 s
2023-05-21 05:55:50,216 44k INFO ====> Epoch: 128, cost 35.10 s
2023-05-21 05:56:25,353 44k INFO ====> Epoch: 129, cost 35.14 s
2023-05-21 05:57:01,674 44k INFO ====> Epoch: 130, cost 36.32 s
2023-05-21 05:57:36,466 44k INFO ====> Epoch: 131, cost 34.79 s
2023-05-21 05:58:09,526 44k INFO ====> Epoch: 132, cost 33.06 s
2023-05-21 05:58:42,501 44k INFO ====> Epoch: 133, cost 32.97 s
2023-05-21 05:58:59,391 44k INFO Train Epoch: 134 [29%]
2023-05-21 05:58:59,396 44k INFO Losses: [2.4819719791412354, 2.301201105117798, 14.97828197479248, 16.09484100341797, 0.7705917358398438], step: 2800, lr: 9.835114106370493e-05, reference_loss: 36.626888275146484
2023-05-21 05:59:16,031 44k INFO ====> Epoch: 134, cost 33.53 s
2023-05-21 05:59:49,318 44k INFO ====> Epoch: 135, cost 33.29 s
2023-05-21 06:00:22,788 44k INFO ====> Epoch: 136, cost 33.47 s
2023-05-21 06:00:56,655 44k INFO ====> Epoch: 137, cost 33.87 s
2023-05-21 06:01:31,309 44k INFO ====> Epoch: 138, cost 34.65 s
2023-05-21 06:02:06,233 44k INFO ====> Epoch: 139, cost 34.92 s
2023-05-21 06:02:41,223 44k INFO ====> Epoch: 140, cost 34.99 s
2023-05-21 06:03:16,447 44k INFO ====> Epoch: 141, cost 35.22 s
2023-05-21 06:03:52,380 44k INFO ====> Epoch: 142, cost 35.93 s
2023-05-21 06:04:21,519 44k INFO Train Epoch: 143 [81%]
2023-05-21 06:04:21,524 44k INFO Losses: [1.8807615041732788, 2.6372532844543457, 14.652959823608398, 15.112489700317383, 0.5258215069770813], step: 3000, lr: 9.824055133639235e-05, reference_loss: 34.80928421020508
2023-05-21 06:04:28,625 44k INFO ====> Epoch: 143, cost 36.25 s
2023-05-21 06:05:01,690 44k INFO ====> Epoch: 144, cost 33.07 s
2023-05-21 06:05:34,433 44k INFO ====> Epoch: 145, cost 32.74 s
2023-05-21 06:06:07,368 44k INFO ====> Epoch: 146, cost 32.94 s
2023-05-21 06:06:39,959 44k INFO ====> Epoch: 147, cost 32.59 s
2023-05-21 06:07:12,854 44k INFO ====> Epoch: 148, cost 32.89 s
2023-05-21 06:07:45,489 44k INFO ====> Epoch: 149, cost 32.64 s
2023-05-21 06:08:18,144 44k INFO ====> Epoch: 150, cost 32.66 s
2023-05-21 06:08:50,747 44k INFO ====> Epoch: 151, cost 32.60 s
2023-05-21 06:09:23,202 44k INFO ====> Epoch: 152, cost 32.46 s
2023-05-21 06:09:40,817 44k INFO Train Epoch: 153 [33%]
2023-05-21 06:09:40,820 44k INFO Losses: [2.426485300064087, 2.5262508392333984, 14.231987953186035, 18.29201889038086, 0.7122567892074585], step: 3200, lr: 9.811781969958938e-05, reference_loss: 38.18899917602539
2023-05-21 06:09:50,503 44k INFO Saving model and optimizer state at iteration 153 to ./logs/44k/G_3200.pth
2023-05-21 06:09:58,273 44k INFO Saving model and optimizer state at iteration 153 to ./logs/44k/D_3200.pth
2023-05-21 06:10:00,551 44k INFO .. Free up space by deleting ckpt ./logs/44k/G_800.pth
2023-05-21 06:10:00,552 44k INFO .. Free up space by deleting ckpt ./logs/44k/D_800.pth
2023-05-21 06:10:17,278 44k INFO ====> Epoch: 153, cost 54.08 s
2023-05-21 06:10:52,325 44k INFO ====> Epoch: 154, cost 35.05 s
2023-05-21 06:11:27,070 44k INFO ====> Epoch: 155, cost 34.74 s
2023-05-21 06:12:02,019 44k INFO ====> Epoch: 156, cost 34.95 s
2023-05-21 06:12:37,932 44k INFO ====> Epoch: 157, cost 35.91 s
2023-05-21 06:13:13,223 44k INFO ====> Epoch: 158, cost 35.29 s
2023-05-21 06:13:47,471 44k INFO ====> Epoch: 159, cost 34.25 s
2023-05-21 06:14:20,359 44k INFO ====> Epoch: 160, cost 32.89 s
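For monitoring a run like this, the "Losses:" entries (five loss components per logged step, plus the current lr and reference_loss) can be extracted programmatically. A minimal sketch, assuming each log entry sits on a single line; the regex and function name are illustrative, not part of so-vits-svc:

```python
import re

# Matches training-loss entries of the form seen above, e.g.
# "... 44k INFO Losses: [2.54, ...], step: 200, lr: 9.98e-05, reference_loss: 40.73"
LOSS_RE = re.compile(
    r"Losses: \[(?P<losses>[^\]]+)\], step: (?P<step>\d+), "
    r"lr: (?P<lr>[\d.e+-]+), reference_loss: (?P<ref>[\d.e+-]+)"
)

def parse_losses(log_text: str):
    """Yield (step, lr, loss_components, reference_loss) per training entry."""
    for m in LOSS_RE.finditer(log_text):
        components = [float(x) for x in m.group("losses").split(",")]
        yield int(m.group("step")), float(m.group("lr")), components, float(m.group("ref"))

# Example on the first loss line from this log:
line = ("2023-05-21 04:49:13,170 44k INFO Losses: [2.5445778369903564, "
        "2.83388090133667, 15.380541801452637, 18.88252830505371, "
        "1.092060923576355], step: 200, lr: 9.98875562335968e-05, "
        "reference_loss: 40.73358917236328")
step, lr, comps, ref = next(parse_losses(line))
print(step, len(comps), ref)  # 200 5 40.73358917236328
```

Feeding the whole log through `parse_losses` gives the reference_loss series (40.73 at step 200 down to the low-30s by step 3200), which is handy for plotting alongside TensorBoard.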